This is pretty far down the list of interesting things to code, but would it be possible to create a visualiser for the slicing objects? Something in jsui would be neat where you pass it the source buffer and slices buffer as arguments and it creates a visualisation showing you the location of segments. My very limited GUI imagination would just have the bipolar waveform with vertical lines running through it.
My reasoning for this is that tuning the parameters can be tiresome when you have to check aurally what the slices are like: where they are, and how well they fit your idea of what is salient in segmentation.
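Something along these lines, as a minimal jsui sketch of the idea (bipolar waveform, vertical line per slice). The buffer names ("source", "slices") and the assumption that slice points are stored as sample frames in channel 1 of a buffer~ are placeholders of mine, not an established FluCoMa convention:

```javascript
// Minimal sketch, not a finished tool: draw a mono waveform from a named
// buffer~ and overlay a vertical line at each slice point read from a
// second buffer~. Buffer names and slice-point format are assumptions.

mgraphics.init();
mgraphics.relative_coords = 0;
mgraphics.autofill = 0;

var srcName = "source";
var sliceName = "slices";

// e.g. send "setbuffers mysound myslices" to the jsui
function setbuffers(src, slices) {
    srcName = src;
    sliceName = slices;
    mgraphics.redraw();
}

function paint() {
    var w = mgraphics.size[0];
    var h = mgraphics.size[1];
    var src = new Buffer(srcName);
    var frames = src.framecount();
    if (frames < 1) return;

    // crude one-sample-per-pixel waveform (see the downsampling discussion below)
    mgraphics.set_source_rgba(0.2, 0.2, 0.2, 1.0);
    for (var x = 0; x < w; x++) {
        var samp = src.peek(1, Math.floor(x * frames / w), 1);
        mgraphics.move_to(x, h * 0.5);
        mgraphics.line_to(x, h * 0.5 - samp * h * 0.5);
    }
    mgraphics.stroke();

    // one vertical line per slice point
    var slices = new Buffer(sliceName);
    mgraphics.set_source_rgba(0.9, 0.3, 0.2, 1.0);
    for (var i = 0; i < slices.framecount(); i++) {
        var px = slices.peek(1, i, 1) / frames * w;
        mgraphics.move_to(px, 0);
        mgraphics.line_to(px, h);
    }
    mgraphics.stroke();
}
```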
EDIT:
I promise I wasn't too lazy and did give it a whack with lcd and jsui, but they were doing my head in, so I'm deferring to those who are more fluent/proficient.
This can also just be a tie-in with my suggestion for a better multichannel waveform display, especially since the slice-point UI/UX is already in the warpy2.js file (though attached to stretching the buffer around).
Maybe a tweak on that would be using different "pointers" at the top to designate the start and end of a slice (if future segmentation objects give onsets and offsets).
Man, that subpatch for p fill_markers is confusing!
Thankfully, a simple message of insertMarker2 $1, with $1 = phase position, is all that warpy's op_warpy.js needs.
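To illustrate, here's a rough sketch (in a [js] object) of what that glue amounts to, assuming the slice points sit as sample frames in a buffer~ called "slices" and the audio in one called "source" (both names are hypothetical):

```javascript
// Hypothetical glue: convert each slice point to a normalised phase (0.-1.)
// and fire one insertMarker2 message per slice. Buffer names are assumptions.
outlets = 1;

function bang() {
    var src = new Buffer("source");
    var slices = new Buffer("slices");
    var frames = src.framecount();
    if (frames < 1) return;

    for (var i = 0; i < slices.framecount(); i++) {
        // normalise the sample position against the source length in frames
        var phase = slices.peek(1, i, 1) / frames;
        outlet(0, "insertMarker2", phase);
    }
}
```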
Any chance that that functionality (insertMarker) can get migrated into the (hopefully included) "general purpose" multichannel display jsui?
That way we don't need a separate jsui for each type of thing we're doing; plus, one might want to run fluid.transientslice~ on the output of fluid.bufnmf~.
I made an abstraction which you can connect to the jsui for fast visualisation. This is basically a way of tapping into @pasquetje's work in a bit more straightforward manner.
Ok, played with it more now (I couldn't get it to work at all before).
Some of the math was wrong in the abstraction (you were dividing by the list length instead of the duration of the source buffer, meaning your first and last indices were 0. and 1., regardless of what they actually were).
I also made it trim off the last two slices (and not render the first) so you only load transients and not buffer boundaries.
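In code terms, the fix plus the trimming looks something like this sketch, assuming the slice points arrive as a Max list of sample positions (names are illustrative):

```javascript
outlets = 1;

function list() {
    var pts = arrayfromargs(arguments);

    // Wrong: pts[i] / pts.length (dividing by the list length), which is
    // what pinned the first and last markers to 0. and 1.
    // Right: divide by the source buffer's length in frames:
    var frames = new Buffer("source").framecount();

    // Drop the first point and the last two, so only transients get
    // rendered, not the buffer boundaries:
    pts = pts.slice(1, pts.length - 2);

    for (var i = 0; i < pts.length; i++)
        outlet(0, "insertMarker2", pts[i] / frames);
}
```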
This is quite handy, as you can visually "tune" the threshold(s) for what looks appropriate (à la Live's UI for slicing).
It would be good to tune it to be more accurate (i.e. accounting for the window offset), but I'm now not sure if the window offset is in samples or ms (so I made a thread about it!). Sadly the compensation won't be generic, as fluid.bufnoveltyslice~ doesn't work the same way.
I never find js to be too great in Max for things like this.
It's not instant or anything, but it doesn't seem to take crazy long (nowhere near as long as the transient stuff, obviously).
I've not taken a close look at the guts of the jsui, but I presume (hope) it's downsampling the buffer before rendering it; otherwise that could take a while, particularly with longer multichannel audio.
Ah right, I only tried the test example on brushes.aif. So if that took around 100 ms, something substantially longer would pinwheel.
I would think it would be easy to downsample the calculation in the jsui.
Here's a Max-based version of a similar thing which I have saved somewhere. It's pretty brutal here, downsampling to 64 "steps", but less brutal versions look better.
Or the abstraction can do this in Max-land and then pass a downsampled buffer into the jsui.
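Per-pixel min/max peak-picking would do it; here's a sketch for the jsui side (function and buffer names are illustrative), so drawing cost scales with the display width rather than the buffer length:

```javascript
// Sketch: min/max downsample a buffer~ to one vertical line per pixel column.
// drawDownsampled and the buffer name are hypothetical.
function drawDownsampled(bufName, w, h) {
    var buf = new Buffer(bufName);
    var frames = buf.framecount();
    var hop = Math.max(1, Math.floor(frames / w)); // samples per pixel column

    mgraphics.set_source_rgba(0.2, 0.2, 0.2, 1.0);
    for (var x = 0; x < w; x++) {
        var chunk = buf.peek(1, x * hop, hop);
        if (!(chunk instanceof Array)) chunk = [chunk]; // peek returns a single number when count == 1
        var lo = 1.0, hi = -1.0;
        for (var i = 0; i < chunk.length; i++) {
            if (chunk[i] < lo) lo = chunk[i];
            if (chunk[i] > hi) hi = chunk[i];
        }
        // draw one min-to-max vertical line for this column
        mgraphics.move_to(x, h * 0.5 - hi * h * 0.5);
        mgraphics.line_to(x, h * 0.5 - lo * h * 0.5);
    }
    mgraphics.stroke();
}
```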
We have been talking at several points about some hack or proper implementation of slice visualization in Max. @weefuzzy had something brewing which was almost ready for distribution, but I can't seem to find a link to a usable/testable example.
Is there any update on that front?
I know we can do this in Reaper with the great tools @jamesbradbury has provided, but I'm looking for a way to keep things inside Max right now.
Thanks, Hans
Not that Max has a great UI or anything, but a couple of weeks ago I was using ReaCoMa to segment some audio, and navigating back and forth like that is a bit tricky and slow when trying to dial in values.
For what it's worth, the code of warpy is already in Max, so it should not be difficult to adapt it to what you want. I use a very garage-style view in Pd where I make a buffer with 1s just at the cutting points and link the zoom, which again works well.
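For anyone who wants that same trick inside Max, a quick sketch in [js] (buffer names are assumed): write a 1 into an otherwise-silent buffer~ at every slice point, then display it alongside the source in a waveform~.

```javascript
function bang() {
    // "slicemarks" is assumed to be a buffer~ the same length as the source
    var markers = new Buffer("slicemarks");
    var slices = new Buffer("slices");
    // clearing sample-by-sample is slow; sending the buffer~ a clear
    // message from the patcher would be more efficient
    for (var i = 0; i < markers.framecount(); i++)
        markers.poke(1, i, 0);
    // drop a 1 at each slice point (slice points as sample frames)
    for (var j = 0; j < slices.framecount(); j++)
        markers.poke(1, Math.floor(slices.peek(1, j, 1)), 1);
}
```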
The point of ReaCoMa is not quick back-and-forth. That would only be reasonably possible if segmentation were separated from the calculation of (for example) novelty or amplitude curves. When/if that happens I'll make something better (: