NMF in a Max for Live device

I just heard on the Sonic State podcast about a new version of this M4L device that basically uses NMF. I like the interface a lot; a lot of work seems to have gone into the UI. I’m tempted to get it as soon as cash flow improves :slight_smile:

This has raised a few lingering questions I’ve had about FluCoMa and how easy/hard it is to bring it into a Max for Live device. While I love playing in Max, I find my creative juices get going more in the Live environment.

I’ve thought about adapting some of the algorithms for M4L, similar to what @jamesbradbury has done with ReaCoMa, but the Live API is so unpleasant, and I have no idea how so much of the workflow would even work in Live (slicing, segmenting, layering, etc.).
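For anyone who hasn’t been burned by it yet, even the “easy” route via [js] and the LiveAPI class involves a fair bit of ceremony. A minimal sketch (the paths follow the Live Object Model; nothing here is FluCoMa-specific):

```js
// list_tracks.js -- minimal sketch: walk the Live set from a [js] object
// inside an M4L device and post each track name to the Max console.
function bang() {
    var set = new LiveAPI("live_set");
    var numTracks = set.getcount("tracks"); // number of "tracks" children
    for (var i = 0; i < numTracks; i++) {
        var track = new LiveAPI("live_set tracks " + i);
        post("track " + i + ": " + track.get("name") + "\n");
    }
}
```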

I’ll be doing some M4Ling very soon, for a piece I’m working on. Like @rodrigo.constanzo, I find the API makes me sad, but I’ll be brave etc. Will post updates here.

I’ve already used NMF in an M4L device (in a very stupid way), and it was fine.

My collaborator @d.murray-rust has made a much more delicious auto sampler in M4L using the novelty slicing stuff.

Are you (or @d.murray-rust) dumping layers, slices, or whatever back into Live? I’ve got some abstraction stuff that can read and write clips, but it seems super tedious to take a long clip and turn it into potentially dozens of clips that correspond to slices. Same goes for layers/objects, etc.

I’m not a huge Live user, so I don’t know if there’s some kind of ‘takes’ interface that I don’t know about, or markers/pointers or something like that, but within the clip-based paradigm those processes seem like a nightmare!
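To make the tedium concrete, here’s roughly what just reading the existing clips on a device’s track looks like from [js]. Note that, as far as I can tell, the API lets you read audio clips but gives you no call to create one from a buffer~, which is exactly the dozens-of-slices problem:

```js
// clip_dump.js -- sketch: enumerate session-view clip slots on this
// device's track and post the name and file path of each audio clip.
function dump() {
    var track = new LiveAPI("this_device canonical_parent");
    var numSlots = track.getcount("clip_slots");
    for (var i = 0; i < numSlots; i++) {
        var slot = new LiveAPI(track.unquotedpath + " clip_slots " + i);
        if (slot.get("has_clip") == 1) {
            var clip = new LiveAPI(slot.unquotedpath + " clip");
            if (clip.get("is_audio_clip") == 1) { // file_path only exists on audio clips
                post(i + ": " + clip.get("name") + " -> " + clip.get("file_path") + "\n");
            }
        }
    }
}
```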

That being said, I hate live.object.

Not dumped anything into clips yet, and I don’t think Dave has either; rather, it’s just maintaining buffers~ that are pinged by different devices. It does look like there’s a lot of boilerplate to deal with there, but I may well feel moved to try at some point!
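That’s the nice thing about staying at the buffer~ level: a named buffer~ is global to the whole Live set, so devices can share audio without touching the clip API at all. A sketch of the inspection side, from [js] (the buffer name is made up; any name unique across the set works):

```js
// buf_watch.js -- sketch: several devices refer to one globally named
// buffer~; this just peeks at it to confirm a grab landed.
var shared = new Buffer("grabbed"); // hypothetical shared buffer~ name

function bang() {
    post("frames: " + shared.framecount() + "\n");
    post("first samples: " + shared.peek(1, 0, 4) + "\n"); // channel 1, frame 0, 4 samples
}
```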

That’d be the rub, I think: getting it to work in a Live-y manner.

I love this discussion, but I wonder if we should move it into a properly named topic :smiley: I feel this is something many Max users who use Ableton Live could learn from!

PS: I share with you all the grief of trying to develop in M4L.

Actually, that being said.

I think the inverse may be true for all the TB2 bits, where “mapping” is a big part of the Live interface, and REAPER seems to be less oriented around that (though I could be wrong, as I’m not a big REAPER user).

Probably too early to tell, but curious what kind of interface things @jamesbradbury is thinking about for TB2 stuff.

Nothing yet. I don’t think the interface right now supports a CLI build anytime soon, especially as things are changing rapidly. If I were going to wrap up any functions right now, it would just be dataset building from REAPER (which I don’t personally need), or I would step into Python.
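For what it’s worth, the dataset side doesn’t need much machinery: the files fluid.dataset~ reads are just JSON with a column count and an id-to-values map, at least in the dumps I’ve seen (treat the exact layout as an assumption). Building one from [js] could look something like this:

```js
// make_dataset.js -- hypothetical sketch: write a fluid.dataset~-style
// JSON file (a "cols" count plus an id -> values map). The two entries
// are fake; real ones would come from analysis. Path handling simplified.
function build() {
    var dataset = { cols: 2, data: {} };
    dataset.data["slice-0"] = [0.12, 0.87];
    dataset.data["slice-1"] = [0.33, 0.41];

    var f = new File("dataset.json", "write"); // lands in Max's default folder
    if (!f.isopen) {
        post("could not open file\n");
        return;
    }
    f.writestring(JSON.stringify(dataset));
    f.close();
}
```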

I have no idea why the M4L objects are so painful. It’s really sad :confused:

I’ve been making a collection of abstractions for grabbing audio in different places. It needs another pass of being kicked into shape, but it’s here: https://github.com/mo-seph/grab.network

This makes it relatively easy to use a novelty curve to grab buffers and feed them into other things. So, with @weefuzzy, I made a novelty-based grabber that feeds into my granular tools, whether to ping off granular sounds automatically or to replace the samples that I’m in the middle of playing…
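The core move is small enough to sketch: once something like fluid.bufnoveltyslice~ has given you slice points, grabbing is just copying a region of one named buffer~ into another for the granular device to read. Buffer names and frame values here are made up:

```js
// grab_slice.js -- sketch: copy a region of a named source buffer~ into a
// destination buffer~. Call as "grab <startFrame> <endFrame>"; the slice
// points would come from a novelty slicer. The destination is assumed to
// be at least as long as the slice.
function grab(startFrame, endFrame) {
    var src = new Buffer("longrecording"); // hypothetical source buffer~
    var dst = new Buffer("grain_source");  // hypothetical destination buffer~
    var samples = src.peek(1, startFrame, endFrame - startFrame);
    dst.poke(1, 0, samples); // write the slice to the start of the destination
}
```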
