Get parallel time-windows of multivariate time series [Max/JS]

Hey! There isn’t any flucoma-specific stuff in the following, but I figured it might be useful in this context. I am working on some mlp mapping between a depth camera and my flucoma-powered granular system. @tremblap suggested that I present the features to the mlp over several time-windows, but I found that a bit tedious to do in Max when many features are involved. (I didn’t really dig into the how, but as I understood it, it would be a zl.stream + mean combo for each feature for each time-window - maybe there is a better trick?)
So I made a little js script/abstraction that does the job: meet bl.timewindow. It could be useful for some mlp business. As always, suggestions for making it better are welcome.

Download:
bl_timewindow.zip (4.4 KB)
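
For anyone curious what this boils down to conceptually: below is a minimal sketch of the parallel-running-mean idea in Max’s js. It is not the actual bl.timewindow source; the window sizes, the message format and the output layout (all window means concatenated into one list) are just assumptions for illustration.

```javascript
// Sketch only (not the bl.timewindow source): keep a buffer of recent
// feature frames and output the feature-wise mean over several window sizes.
// Save as e.g. timewindow_sketch.js and load it with [js timewindow_sketch.js].

inlets = 1;
outlets = 1;

var windowSizes = [5, 20, 50];  // frames per time-window (illustrative defaults)
var history = [];               // recent frames; each frame is a list of features
var maxSize = Math.max.apply(null, windowSizes);

// feed one frame of features as a Max list, e.g. "0.1 0.3 0.7"
function list() {
    var frame = arrayfromargs(arguments);
    history.push(frame);
    if (history.length > maxSize) history.shift();

    // for each window size, average the last N frames feature-wise,
    // then concatenate all the window means into one long output list
    var out = [];
    for (var w = 0; w < windowSizes.length; w++) {
        var n = Math.min(windowSizes[w], history.length);
        for (var f = 0; f < frame.length; f++) {
            var sum = 0;
            for (var i = history.length - n; i < history.length; i++) {
                sum += history[i][f];
            }
            out.push(sum / n);
        }
    }
    outlet(0, out);
}
```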


Using fluid.bufstats~ to do this would be tedious, unless you scripted the creation of the fluid.bufstats~ objects/arguments in js and iterated over them, which leads you back to the same place you’ve already arrived at. (:

Maybe you would be interested in, or are already aware of, sktime, a Python package for working with time series data and machine learning.

There are some nifty examples, such as classification with time series data. I’ve not jumped in too much myself, but I think that in general we (techno-fluent musicians, in flucoma-speak) often rely on, or revert to, stateless data (summaries of time-invariant things), and there could be something juicy in working with time series that represent the morphological, time-variant aspects of our sounds.


Wow, thanks a lot, James!

Thanks! I wasn’t aware of it; this is very nice! I’ll have to study it a bit, but it actually looks straightforward.

Great point, and I am sure this is the case, though I have only narrow experience with the topic. I did two projects with LSTMs (in tf.keras), and although they were a bit of a pain to train, the LSTMs definitely performed much better than regular MLPs (“dense layers”) fed with the shifting-window trick. I have been planning forever to get back to this, build a tensorflow.js-based model loader/player, and wrap it in a Max abstraction (see the sketch at the end of this post).
But so far I’ve gotten totally “distracted” (in a good way) by the fluid.mindset of doing ML in Max, and have also learned a ton of things (mostly from @tremblap) that I most certainly should have known back in my Python experiments as well… :))
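
Since the loader/player mentioned above doesn’t exist yet, here is only a hedged sketch of what it could look like inside Node for Max ([node.script]), assuming a Keras model converted with tensorflowjs_converter. The handler names and message format are invented for illustration.

```javascript
// Sketch only: load a converted Keras model and run predictions from Max.
// Save as e.g. tfjs_player.js and load with [node.script tfjs_player.js];
// @tensorflow/tfjs-node must be installed in the script's folder.
const Max = require('max-api');
const tf = require('@tensorflow/tfjs-node');

let model = null;

// "load /path/to/model.json" -> load a model converted with tensorflowjs_converter
Max.addHandler('load', async (path) => {
    model = await tf.loadLayersModel('file://' + path);
    Max.post('model loaded: ' + path);
});

// "predict f1 f2 f3 ..." -> run one forward pass on a feature list
Max.addHandler('predict', async (...features) => {
    if (!model) {
        Max.post('no model loaded yet');
        return;
    }
    const input = tf.tensor2d([features]);   // shape [1, numFeatures]
    const output = model.predict(input);
    const values = Array.from(await output.data());
    await Max.outlet(...values);             // send the prediction out as a Max list
    input.dispose();
    output.dispose();
});
```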
