FluCoMa integration in PlugData – is it on the roadmap?

Hi,
just wondering if there are any plans or roadmap items to make FluCoMa tools available inside PlugData in the future.

Thanks!
Emiliano

Hello,

my 2 cents:

I haven’t tried to use FluCoMa in the plugdata environment, but according to the FAQ

it should work ( FAQ · HonKit )

However, this won’t work when plugdata runs as a VST plugin.

There is (or should be…) a way to use it as a VST plugin too, by recompiling plugdata with FluCoMa’s sources included. The info is on GitHub: GitHub - plugdata-team/plugdata: Pure Data as a plugin, with a new GUI

under the “Adding your own externals” section.

thanks, I’ll look into it more; I actually haven’t done it properly yet. I need to port an old FluCoMa slicer I made in Vanilla. You can see it here: https://www.youtube.com/watch?v=_gN4yWyQjxY

in this patch, which happens to run on PlugData: https://www.youtube.com/watch?v=seQLbhqXEnM

thanks

As far as I know, the Pure Data versions of the externals work just fine in plugdata.

Thanks Rodrigo, I have to try it as soon as possible. I hope so!

If PlugData accepts external libraries, it will work.

If PlugData accepts compiled GUI extensions in the dreaded Tcl (classic style), those will work too.

I’m almost certain that the answers are yes and no.

Thanks for the answer, but I’ve just tested it: I added the local path in Preferences → Paths and also tried
[declare -lib FluidCorpusManipulation], but PlugData still doesn’t recognize the objects.

So even though the standalone version technically supports externals, FluCoMa doesn’t get picked up; it behaves as if the library isn’t present at all.

It seems to be one of those cases where the Pd binary is compatible, but PlugData doesn’t bind the externals correctly (especially those that rely on deeper integration / compiled components).

emiliano

I think you declared it wrong: -lib fluid_libmanipulation is what you want.

Same steps as here but with the declare object instead:

Ah! Ok then I was clearly overthinking it :sweat_smile:
I’ve just tried again with the proper -lib fluid_libmanipulation declaration and everything loads correctly now.
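In case it helps anyone who lands here later, the whole test boils down to something like this (a minimal sketch; fluid.onsetslice~ is just an example object name, any object from the package should behave the same):

```
#N canvas 100 100 480 260 12;
#X obj 30 30 declare -lib fluid_libmanipulation;
#X obj 30 80 fluid.onsetslice~;
#X text 30 130 if the object above creates without an error in the console \, the library has been found;
```

If the externals are on a search path (Preferences → Paths) or installed next to the patch, the object box should instantiate cleanly when the patch loads.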

Thanks for the nudge; sometimes my brain decides to take the scenic route before arriving at the obvious solution!

I still have a small issue with fluid.waveform though, so I’m probably doing something else wrong somewhere along the way (totally on brand for me).

No, this is what I meant earlier by this:

I think its GUI is made in JS or something, so our GUI won’t work. There are only 2 (waveform and plotter), and if you poke people over on their forum I might get motivated to co-port it with them.

Thanks, I’ll try :) In any case, the slicer works.


this tune/patch is epic! Congrats!


Hi Pierre,
thank you so much, it really means a lot coming from you.

Gesture Carrier is actually a submodule of a broader environment I’m developing, called Envion. The project works on two intertwined axes: gesture as control and the archive as material. In parallel it integrates Net-Audio, which sources micro-fragments from open repositories (Archive.org, WikiCommons, BBC SFX, etc.). The core idea is sampling without knowing: material is not selected but encountered, and composition emerges from listening rather than curation.

The Gesture Carrier part I’m refining now focuses on shaping abstract gestures via wavetable envelopes derived from real audio waveforms. I’m currently redesigning that system to make the articulation more responsive and fine-grained, so that control emerges from sound rather than being imposed on it.
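To give a rough idea of the mechanism (a stripped-down sketch of the general idea, not the actual Envion patch; gesture.wav and the table size are placeholders): a short fragment is read into a table, and the table is then scanned as an amplitude envelope over another voice.

```
#N canvas 100 100 560 400 12;
#N canvas 0 0 450 300 (subpatch) 0;
#X array gest-env 4096 float 2;
#X coords 0 1 4096 -1 200 120 1;
#X restore 30 30 graph;
#X obj 30 180 loadbang;
#X msg 30 210 read gesture.wav gest-env;
#X obj 30 240 soundfiler;
#X msg 300 180 0 \, 4096 2000;
#X obj 300 210 line~;
#X obj 300 240 tabread4~ gest-env;
#X obj 380 240 osc~ 220;
#X obj 300 280 *~;
#X obj 300 320 dac~;
#X text 30 300 click the message box to scan the stored fragment as a 2-second envelope;
#X connect 1 0 2 0;
#X connect 2 0 3 0;
#X connect 4 0 5 0;
#X connect 5 0 6 0;
#X connect 6 0 8 1;
#X connect 7 0 8 0;
#X connect 8 0 9 0;
#X connect 8 0 9 1;
```

Because [line~] drives the read position, the same stored waveform can be scanned at any rate, so the shape of the gesture stays independent of the fragment’s original duration.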

Here’s a small playlist showing just a fraction of the sonic palette Envion can generate:
https://www.youtube.com/watch?v=kylLIwebgj0&list=PLLITukQh1_l61lP6GMfa1Hz4Db7_wrTTT

And this is the project page (still in progress):

The long-term goal is for the system to remain porous: analysis and gesture feed back into sampling, so the network ends up “listening” to itself as much as it listens to the outside world.

Thanks again for the encouragement; it’s genuinely motivating on the research side.

thanks, emiliano


I forgot to mention: the main articulation system in Envion is based on text-file databases that store triplets for vline~ in Pure Data, while Gesture Carrier (as a submodule) uses wavetable-style envelopes instead.
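To make that concrete: each database file is just a sequence of vline~ triplets (target, ramp time in ms, delay in ms), something along these lines (the values here are invented for the example):

```
1 5 0;
0.4 120 5;
0.7 60 125;
0 800 185;
```

A minimal reader can dump the whole file into [vline~] in one shot and let the per-triplet delays schedule the gesture. This is only a sketch of the idea, not the actual Envion reader, and gesture-01.txt is a placeholder name:

```
#N canvas 100 100 560 420 12;
#X obj 30 30 loadbang;
#X msg 30 60 read gesture-01.txt;
#X obj 30 100 textfile;
#X msg 160 30 bang;
#X obj 160 60 t b b;
#X msg 230 100 rewind;
#X obj 160 130 until;
#X obj 30 200 vline~;
#X obj 160 230 osc~ 220;
#X obj 160 270 *~;
#X obj 160 310 dac~;
#X text 300 60 click the bang message to play one stored gesture;
#X connect 0 0 1 0;
#X connect 1 0 2 0;
#X connect 3 0 4 0;
#X connect 4 0 6 0;
#X connect 4 1 5 0;
#X connect 5 0 2 0;
#X connect 6 0 2 0;
#X connect 2 0 7 0;
#X connect 2 1 6 1;
#X connect 7 0 9 1;
#X connect 8 0 9 0;
#X connect 9 0 10 0;
#X connect 9 0 10 1;
```

Because vline~ timestamps every segment, all the triplets can be delivered in the same logical tick and still unfold as an articulated envelope.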

this is seriously impressive - I’ll try it for real and will report back :slight_smile:

thanks for sharing!


Great! Thanks :) Let me know :)


The GUI is JUCE
