Just Too Happy Not To Share EDIT: Pd solution too!

OK, so the Max users will have seen, in the example folder, a document called nb_of_slices. It was a sort of attempt to specify the number of slices I want and get Max to iteratively guesstimate the threshold needed to make it work.

I was dumb: I was just inverting the error and adding a bit of noise to avoid oscillation in my feedback. Cybernetics 101 (actually, probably 001). It kinda works, but just kinda. It made my teacher @weefuzzy smile.
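
In pseudo-code, the old idea was basically this (a made-up sketch, not the actual js in the example folder):

```javascript
// A made-up sketch of the old idea, not the actual js from the example folder:
// nudge the threshold proportionally to the error, plus a little noise so the
// feedback does not oscillate between the same two values forever.
function naiveStep(threshold, gotSlices, wantedSlices) {
    var error = gotSlices - wantedSlices;        // too many slices -> positive error
    var noise = (Math.random() - 0.5) * 0.01;    // tiny jitter against oscillation
    return threshold + 0.001 * error + noise;    // more slices -> raise the threshold
}
```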

For my piece for Wet Ink, I needed it to work all the time. So I decided to do something a bit more intelligent. It is still probably at the bottom of the list of what one can do, but it does a sort of linearisation of the problem: it takes the last 2 thresholds and how many slices they yield, interpolates linearly to a guesstimated value, and does that over and over until it reaches the right count. And it works.
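
The core of it looks roughly like this (a sketch of the idea, not the actual threshfinder.js; countSlices() is a stand-in for whatever runs the slicer and counts the result):

```javascript
// A sketch of the 'honing in' idea, not the actual threshfinder.js.
// countSlices(threshold) stands in for running the slicer and counting the slices.
function findThreshold(target, countSlices) {
    var t0 = 0.1, n0 = countSlices(t0);      // first guess
    var t1 = 0.5, n1 = countSlices(t1);      // second guess
    for (var i = 0; i < 50 && n1 !== target; i++) {
        var next;
        if (n1 === n0) {
            next = t1 * 1.5;                 // no change between the 2 points: push further
        } else {
            // interpolate linearly between the last 2 (threshold, slices) points
            // and solve for the threshold that should give the target count
            next = t1 + (target - n1) * (t1 - t0) / (n1 - n0);
        }
        t0 = t1; n0 = n1;                    // keep only the last 2 points
        t1 = next;
        n1 = countSlices(t1);
    }
    return t1;
}
```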

So I coded it in SuperCollider too, and found a few bugs in the Max/JS version along the way, so it is all fixed now, I think. Now I struggle to think of the equivalent of JavaScripting in Pd: I might have to do dirty iterative code in actual Pd, but in the meantime, here is the toy in Max and SC for everyone to have a go at, and probably find more bugs.

I’ve commented the code, so it should be self-explanatory, but please feel free to ask questions and/or suggest improvements.

p

SC: nb_of_slices.scd.zip (1.0 KB)
Max: Archive.zip (6.2 KB)


This is cool.

Not played with it too much yet, but I did find that if you (mistakenly) send a negative number to threshfinder.js, it not only spins out indefinitely, it will also return bad answers thereafter if you request a reasonable number of slices.

Probably worthwhile internally clamping the values in the .js (as well as the numbox in Max-land).
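
Something like this inside the .js would probably do (a sketch, with made-up names):

```javascript
// A sketch of the kind of internal clamping I mean (names are made up):
function sanitize(threshold, wantedSlices) {
    threshold = Math.min(Math.max(threshold, 0.0001), 0.9999); // keep strictly inside (0., 1.)
    wantedSlices = Math.max(1, Math.round(wantedSlices));      // at least 1 slice, whole numbers
    return [threshold, wantedSlices];
}
```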


This is super easy-to-follow code and such an important utility, imo. What is super nice about having this in Max land is that you can do all the testing in memory, whereas with the CLI I have to write/read from disk, which is slower, albeit not by much given the ubiquity of SSDs.

What made you come to the linear interpolation approach? You have inspired me to inject this functionality into the REAScripts, but I just went with a brute-force method with an aggressiveness setting for how far it searches. I then add noise to the exploration amount so that you don't get stuck in a loop of *2, /2, *2.
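
Roughly, the step looks like this (a sketch of the idea, not the actual Lua, translated to js to match the rest of the thread):

```javascript
// A rough sketch of the brute-force step, translated from the Lua idea to js:
// scale the threshold up or down by an aggressiveness factor, with noise mixed in
// so you don't ping-pong forever between the same two values.
function bruteStep(threshold, gotSlices, wantedSlices, aggressiveness) {
    var jitter = 1 + (Math.random() - 0.5) * 0.2;        // the noise in the exploration amount
    var step = aggressiveness * jitter;                  // e.g. aggressiveness = 2 for the *2 / /2 case
    return gotSlices > wantedSlices ? threshold * step   // too many slices: raise the threshold
                                    : threshold / step;  // too few: lower it
}
```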

EDIT:

If you want to peek :slight_smile: https://github.com/jamesb93/REAPER-Scripts/blob/auto-param/flucoma/noveltyslice-auto.lua


I definitely need to make it sturdier in terms of clamping between almost 0 and almost 1…

Thanks!

@weefuzzy’s smiles. I needed something more intelligent than random :wink: It only takes the last 2 points, which in effect hones in on the target. A sort of intuition of what I actually do when I wiggle that knob.

The SuperCollider code is more elegant and clear, I think; I might do a rewrite.

I did not need noise in the linear one, just a condition that pushes further if the slice count does not change. Maybe I need to test it more…

I looked at your code, and it is very similar to my first approach in the Max folder, the one that @weefuzzy found beautifully brutal. The linear interpolation converges a lot better on difficult cases. For instance, try (in Max) to segment on pitch and you will see that the thresholds are so finicky that I did not get any results with the noise approach… hence the idea of ‘honing in’ with naive linear interpolation. I reckon I could keep all the points to make a better regression, but that’ll be my next homework :wink:
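
For the record, that homework would be something like this (purely illustrative, not in any of the patches yet):

```javascript
// Purely illustrative, not in any of the patches yet: fit nbSlices ~ a * threshold + b
// over every (threshold, nbSlices) pair tried so far, then solve for the target count.
function regressThreshold(thresholds, counts, target) {
    var n = thresholds.length, st = 0, sc = 0, stt = 0, stc = 0;
    for (var i = 0; i < n; i++) {
        st += thresholds[i];
        sc += counts[i];
        stt += thresholds[i] * thresholds[i];
        stc += thresholds[i] * counts[i];
    }
    var denom = n * stt - st * st;
    if (denom === 0) return thresholds[n - 1];       // all guesses identical: nothing to fit
    var a = (n * stc - st * sc) / denom;             // slope of the least-squares line
    var b = (sc - a * st) / n;                       // intercept
    return a === 0 ? thresholds[n - 1] : (target - b) / a;
}
```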

OK, I’ve updated the links above with the new versions:

  • now clamped to avoid naughty values
  • now has a tolerance: you can allow +/- a number of slices (sketched below), 0 by default, because, you know, control is fun
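
The tolerance check is basically this (a sketch, the real thing lives in the js/scd):

```javascript
// A sketch of the tolerance check (the real thing lives in the js/scd):
// accept any result within +/- tolerance slices of the target, 0 by default.
function closeEnough(gotSlices, wantedSlices, tolerance) {
    tolerance = tolerance || 0;                      // default 0: control is fun
    return Math.abs(gotSlices - wantedSlices) <= tolerance;
}
```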

p

This is the version in Pd, using [value]s and the ability to use them in [expr]. Pd is growing on me. The patch is ugly and under-documented, so if any good Pd user wants to contribute to my learning, feel free to tackle it!

p
nb_of_slices.pd.zip (2.0 KB)