SuperCollider Audio Query Class

This relates to @tedmoore’s Audio Query in SuperCollider Demo example patch.

I have started to move this functionality inside a class and I’ve made the repo public here.

There are still some action items; I’ve been a bit too busy lately to finish, but I thought I’d share in case it’s useful to anyone else.


Can someone show me an example of using the channelFunc argument of FluidLoadFolder?

The following code generates errors:

FluidLoadFolder(source_files_folder, channelFunc: {
  arg ...args; args.postln;
});

Post window output, followed by the errors:

[ 1, 1, 1 ]
[ 1, 1, 2 ]
[ 1, 1, 3 ]
[ 1, 1, 4 ]
[ 1, 1, 5 ]
[ 1, 1, 6 ]
[ 1, 1, 7 ]
[ 1, 1, 8 ]
[ 1, 1, 9 ]
[ 1, 1, 10 ]
[ 1, 1, 11 ]
[ 1, 1, 12 ]
[ 1, 1, 13 ]
[ 1, 1, 14 ]
[ 1, 1, 15 ]
[ 1, 1, 16 ]
[ 1, 1, 17 ]
[ 1, 1, 18 ]
Channel index out of range.
Channel index out of range.
...

What is the expected return value of channelFunc? Something seems off here (perhaps a bug?).

It should return an array of channel numbers to read from the file, such as could be passed to Buffer.readChannel. Its purpose is to give fine control over cases where the channel count in your source folder is variable (e.g. a mixture of mono and stereo). The Buffers populated by FluidLoadFolder will always have as many channels as whatever the ‘widest’ file in the folder is, and the default behaviour with ‘narrower’ files is just to wrap and repeat source channels until the destination is full.

The input arguments to the function are: the channel count of the current file; the maximum channel count across the folder; the index of the current file.

For the above, it looks like all your files are mono anyway, so you could just leave channelFunc as nil, or do ^[0].
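For example, a minimal sketch (the argument names and the ~sourceFolder variable are placeholders of mine, not part of the interface):

~loader = FluidLoadFolder(~sourceFolder, channelFunc: {
  arg numChans, maxChans, fileIndex;
  // the function's last expression is its return value: an array of channel
  // indices to read from this file, as you would pass to Buffer.readChannel
  (0..numChans - 1);  // i.e. [0] for the mono files in your folder
});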

It could absolutely be documented better, and perhaps designed better too :grimacing:

Made some progress with @tedmoore.

Here’s what it sounds like in action.

Still having some weird behavior with corpus directories which have a mix of mono and stereo files. My channelFunc looks like this:

{
  arg currentFile, channelCount, currentIndex;
  if(channelCount == 1, {^[0,0]});
  if(channelCount == 2, {^[0,1]});
}

I expected my channelFunc to work, but I get an error like this:

AudioQuery: Initializing Class
loadCorpusFilesSync
bufnum: 2
numFrames: 546274
numChannels: 2
sampleRate: 48000.0

processCorpusSlices
ERROR: FluidBufOnsetSlice:  Invalid source buffer

That looks more like there’s a problem with the buffer that got loaded (that server error comes from within the FluidBufOnsetSlice C++ code). You should be able to leave your channelFunc as nil if this is the behaviour you want, as that’s the default anyway.

Looking at your code, I’m wondering if it’s an asynchrony problem with

  this.loadCorpusFilesSync(path); s.sync;
  this.processCorpusSlices; s.sync;
// etc

i.e. loadCorpusFilesSync might not have finished before you call processCorpusSlices? Do your various postln calls definitely happen in the right order? Looks to me from the above that there should have been

"all files loaded"
"num channels: 2" 

posted as well?
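To sketch what I mean (placeholder names only, and I’m assuming the load is kicked off with FluidLoadFolder’s play and its done action, so this is not your actual class code): drive the whole chain from a Routine and don’t move on until the loader has finished.

fork {
  var loaded = Condition.new;
  ~loader = FluidLoadFolder(~corpusFolder);
  ~loader.play(s, { "all files loaded".postln; loaded.unhang });
  loaded.hang;  // block this Routine until the load reports completion
  s.sync;       // and let the server catch up as well
  // only now is it safe to call processCorpusSlices, etc.
};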

Ok, so the first thing is that you are right about the async problem. The code runs without any error if I fork like this:

FluidLoadFolder(corpus_files_folder, channelFunc: {
  arg currentFile, channelCount, currentIndex;
  fork {
    channelCount.postln;
    if(channelCount == 1, {[0,0].yield}); s.sync;
    if(channelCount == 2, {[0,1].yield}); s.sync;
  }
}); s.sync;

The behavior I want is to always assume a stereo sample for both playback and analysis. If there’s a stereo sample available then I want it played back. If not, I want to hear two copies of the mono signal. PlayBuf should never complain like this:
Buffer UGen channel mismatch: expected 2, yet buffer has 1 channels

I expected that if(channelCount == 1, {[0,0].yield}); would work a bit like how you might do sig!2, except in this case it would read channel 0 twice into the load folder buffer.
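In other words, this is roughly the channelFunc I was hoping I could write (no fork/yield, the index array as the function’s last expression, argument order as described in the reply above):

{
  arg channelCount, maxChannels, fileIndex;
  // double up mono files, take both channels of stereo files
  if(channelCount == 1, { [0, 0] }, { [0, 1] });
}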

In the recording you hear two test examples.

First, I load a mono kit and then play back the corpus in its entirety. Here I hear only the left channel, where I expect to hear both channels. Then I perform some real-time queries against a real-time input. That works as expected, minus the mono issue.

Then I load a mixed (mono and stereo) kit and play back the corpus in its entirety. Here I hear both channels for all sounds. Then I perform some real-time queries against a real-time input. Something is wrong here; I’m not sure if it’s a problem with the analysis or the indexing.

I feel like I’m missing something obvious. Otherwise everything works and I have made tweaks to the synth to synchronize the query with an onset detector.

SC_220601_192250.aiff.zip (5.3 MB)

By the way, if I leave channelFunc as nil I still get the weird playback issue you hear in the recording with the bongos.

Short video example of improvements using an onset detector.
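For anyone curious, the gist of the onset-synchronized query is something like this stripped-down sketch (not the actual class code; the bus number and names are placeholders):

SynthDef(\onsetQuery, {
  arg inBus = 0;
  var in = SoundIn.ar(inBus);
  var trig = FluidOnsetSlice.ar(in);  // impulse at each detected onset
  var feats = FluidMFCC.kr(in);       // stand-in for whatever features the query uses
  SendReply.ar(trig, '/queryOnset', feats);
}).add;

// language side: only run the lookup when an onset actually happened
OSCdef(\queryOnset, { |msg|
  var feats = msg[3..];
  // ...run the nearest-neighbour query against the corpus here and play the match
}, '/queryOnset');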