Time stretch in Max

I’m sure @a.harker will let us know where his multiresolution FrameLib experiments have led him. Last I heard it was very promising, if still a little spectrally skewed. That was a few years ago (how time flies!)

Just to wrap this up on the SC side, I finally got this “right” and the python version is now fully functional thanks to Alex Ness and Jem Altieri.


I did a freeze-type thing with framelib based on one of the examples (not a time-stretch, but related), which exists as a plug-in here for anyone who is interested.

https://drive.google.com/drive/folders/13gp-HqqnreMMZ79Pi-a9njq3ZfrkDU_i?usp=sharing


This sounds great. Can you explain the morph? Also, do we all need to make manuals now? I’m not doing that!

I assume it’s maybe something like vectral~, smoothing the changes in magnitudes whilst still generating the phases using the Gaussian RNG.

The topology is something like this

  • the input is “blurred” by averaging spectral frames, in terms of magnitude and phase deltas, over time - so the blur control alone smears time before the signal has even reached the freeze-type bit.
  • there is then a system that triggers sampling of the output of the blur: randomly, regularly or “freeze” (stop sampling).
  • the fragment control means you can set it to only sample random chunks of the spectrum (rather than the whole thing, as happens at zero percent).
  • then the morph is a linear interpolation between the previously sampled values and the new ones. If the sampler keeps triggering, that can throw off the timing, because each time it samples, the previously sampled values get set to whatever the last output was, so you get into a Zeno’s-paradox-type situation.
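To illustrate the morph behaviour in the last point, here is a minimal sketch of a re-triggerable linear ramp. The function name `morph_steps` and the “re-trigger halfway” scenario are my own simplifications for illustration, not the plug-in’s actual code:

```python
# Minimal sketch of the "morph" stage: a linear interpolation from the
# previously sampled value towards a newly sampled target. If a new sample
# is triggered before the ramp completes, the start point resets to the
# current output, so the output creeps towards the target in ever-smaller
# steps - the Zeno's-paradox effect described above.

def morph_steps(start, target, n_steps):
    """Yield n_steps linearly interpolated values from start towards target."""
    for i in range(1, n_steps + 1):
        t = i / n_steps
        yield start + t * (target - start)

# Uninterrupted morph: reaches the target exactly.
ramp = list(morph_steps(0.0, 1.0, 4))  # [0.25, 0.5, 0.75, 1.0]

# Re-triggered morph: each re-trigger restarts from the current output.
# If re-triggered halfway through every ramp, the remaining distance
# halves each time, so the output never quite arrives at the target.
value = 0.0
for _ in range(5):
    steps = list(morph_steps(value, 1.0, 4))
    value = steps[1]  # re-trigger happens halfway through the ramp
```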

The resynthesis is stochastic, in that the output of the previous stages provides a mean for the magnitude / phase delta, but also a standard deviation for each (I skipped those above for simplicity). So there are four values per bin, from which you generate two Gaussian randoms (one for magnitude and one for phase delta), and then you do a standard phase-accumulation phase vocoder to keep the phases continuous.
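A minimal sketch of that per-bin scheme, assuming the four statistics per bin have already been computed upstream (the function name and data layout here are my own illustration, not the actual FrameLib network):

```python
import cmath
import random

def resynth_frame(stats, prev_phases):
    """One frame of stochastic resynthesis.

    stats: per-bin (mag_mean, mag_std, dphase_mean, dphase_std) tuples.
    prev_phases: accumulated phase per bin from the previous frame.
    Returns (complex spectrum, new accumulated phases).
    """
    spectrum, phases = [], []
    for (m_mu, m_sd, d_mu, d_sd), prev in zip(stats, prev_phases):
        mag = max(0.0, random.gauss(m_mu, m_sd))  # draw a magnitude
        dphase = random.gauss(d_mu, d_sd)         # draw a phase delta
        phase = prev + dphase                     # phase accumulation
        spectrum.append(cmath.rect(mag, phase))   # polar -> cartesian
        phases.append(phase)
    return spectrum, phases

# Two bins with zero standard deviation: degenerates to a deterministic
# phase-accumulation phase vocoder.
stats = [(1.0, 0.0, 0.1, 0.0), (0.5, 0.0, 0.2, 0.0)]
spec, new_phases = resynth_frame(stats, [0.0, 0.0])
```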

The plug-in DSP is all written in FrameLib as a Max patch and exported to C++, with the glue/GUI added afterwards.
The manual exists because this is something we are giving out to prospective students, so it needed to. It took less than an hour to write, though, I think.

Yes but the phase deltas are also smoothed. And it’s not the mags/phase deltas directly but the means/std dev that are used to generate them.

All code (including the patch) is here:


Is that similar to the stochastic freeze example in Framelib?

Is there a good source to learn how to turn a max patch into a VST?

Thank you for this. In the Analysis Max Patch, it looks like you are taking the real and imaginary numbers, making the mag and phase, and then doing the maths on those, no?

Also, what does the MS processing do? Why do that before and after the FFT?

I imagine this might give some insight into that:

I saw that and was confused too, but I think it’s MS as in mono/stereo, rather than mid/side.

That is correct.

Yes - the maths is done on polar values, although for the phase-delta averaging I actually take the deltas back into cartesian form and do angular averaging there (based on something I read on the internet).
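For anyone unfamiliar with that trick, here is the standard circular-mean approach as a standalone sketch (my own illustration of the general technique, not the literal patch code): each angle becomes a unit vector, the vectors are summed, and the angle of the sum is the average.

```python
import cmath
import math

def circular_mean(angles):
    """Average angles by summing unit vectors and taking the argument.

    Naive arithmetic averaging of angles fails near the +/-pi wrap
    point; averaging in cartesian form does not.
    """
    return cmath.phase(sum(cmath.rect(1.0, a) for a in angles))

# 3.1 and -3.1 are both close to pi, so their circular mean should be
# near +/-pi - but their arithmetic mean is 0.0, which is wrong.
near_pi = circular_mean([3.1, -3.1])
near_zero = circular_mean([0.1, -0.1])
```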

The MS processing is exactly what it sounds like on the tin, but it isn’t used directly in the plug-in (which just has an MS adjustment post-effect). Originally, I was hoping to get better imaging by doing the process on the M and S signals, and you do get a better central image. But it turns out I’m not tracking the relative phase of the M and the S, so “side” might mean left or right, and you lose the stereo image proper - so I abandoned it. That part of the patch doesn’t get exported, so I didn’t bother removing it.
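For reference, the MS encoding in question is the usual sum/difference pair (a generic sketch of the standard technique, not the patch itself). The sign of S relative to M is exactly the “relative phase” mentioned above: lose it, and the decode can no longer tell left from right.

```python
def ms_encode(left, right):
    """Sum/difference encode: M is the mono sum, S the stereo difference."""
    mid = [l + r for l, r in zip(left, right)]
    side = [l - r for l, r in zip(left, right)]
    return mid, side

def ms_decode(mid, side):
    """Inverse transform: the 0.5 undoes the doubling from the encode."""
    left = [0.5 * (m + s) for m, s in zip(mid, side)]
    right = [0.5 * (m - s) for m, s in zip(mid, side)]
    return left, right

L, R = [1.0, 0.5], [0.25, -0.5]
M, S = ms_encode(L, R)
L2, R2 = ms_decode(M, S)            # round-trips back to L, R
Lx, Rx = ms_decode(M, [-s for s in S])  # S sign flipped: channels swap
```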

I should clarify. Rod is right that I used iPlug2, so learning that (which I also work on with Oli) is not a bad idea for a quick environment to make plug-ins.

However, the way I get from the patch to the plug-in is not possible generically for Max. I use an “export” feature of FrameLib that turns the coded network directly into a C++ class, and that’s what forms the basis of my plug-in. You can (and I have) do that sort of thing with gen~ as well (the Diffuse plug-in from Surreal Machines is done that way). @tutschku - I’m happy to go through plug-in stuff with you and/or others at some point from an iPlug perspective, using some online format.


That would be useful, I think. Especially as I have stepped into the DAW more enthusiastically over the last 1.5 years, I do miss having access to some of the processes I wrote in Max.
