FluidBufCompose new model

What am I messing up here? I thought the new codebase would wait for each process to be done before getting to the “write file” portion of the code. I am loading a lot of really big files, so maybe that is the problem?

Routine{
	server.sync;
	buffers.do{|chan, i|
		chan.do{|buf, i2|
			("chan:"+i+"buffer"+i2).postln;
			FluidBufCompose.process(server, buffers[i][i2], 0, -1, 0, -1, 1, finalBuf, i*chunkSize, chan, 0).wait;
			server.sync;
		};
	};
	"write file".postln;
	server.sync;
	finalBuf.query;
	finalBuf.duration.postln; finalBuf.numChannels.postln;
	if((finalBuf.duration*finalBuf.numChannels)>5000){
		"w64".postln;
		finalBuf.write(folder.fullPath++folder.folderName++".w64", "w64", "int24");
	}{
		finalBuf.write(folder.fullPath++folder.folderName++".wav", "wav", "int24");
	}
}.play

@weefuzzy will be able to confirm, but I think the .wait and the server.sync are doing the same thing, so either one alone would be enough; both are not needed.
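
To illustrate (just a sketch, reusing the variable names from the code above): the inner loop only needs one of the two, for example keeping the .wait and dropping the per-iteration server.sync.

FluidBufCompose.process(server, buffers[i][i2], 0, -1, 0, -1, 1, finalBuf, i*chunkSize, chan, 0).wait;
// server.sync; // redundant: the .wait above already holds the Routine until the job reports done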

I am trying to find an error in your code, but apart from that it should work. The size of the files shouldn’t be an issue, unless the loading is not stalling the routine… maybe it is the iterator over the buffers. What is the class of buffers, so I can try it?

Did you try processBlocking instead of process? For BufCompose it should not make any difference (because we always make it blocking), but it is a good reflex to think about which processes spawn a new thread and which do not.
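
For reference, the call form is identical, only the method name changes (a sketch using the same arguments as the code above); and since BufCompose always runs blocking on the server, a server.sync afterwards is enough to know it has finished.

FluidBufCompose.processBlocking(server, buffers[i][i2], 0, -1, 0, -1, 1, finalBuf, i*chunkSize, chan, 0);
server.sync; // the job sits in the server command queue, so this returns only once it is done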

Also, if that is a large task, I presume your server status indicator turns ‘yellow’ in the bottom right? It should if you are doing a significantly large task on the server thread itself.

If you let me know the class of your buffers collection and the length of the files, I’ll try to reproduce. I have my large 28-channel piece, which should be a good use case (that is, if SC supports that channel count without crashing, unlike Max :slight_smile:)

I’ve done 142 channels in SC. Sorry, I couldn’t pass up the chance to talk up SC. Gotta fight the battles, you know.


@spluta, I’ll have a look. Just to confirm: it doesn’t appear to be waiting properly before hitting "write file".postln? It certainly should, and I’m not sure you should need any of those syncs under the new regime.

This is painfully slow (although I have debug builds of both our stuff and SC), but it seems to be waiting as expected:

(
~buffers = 10.collect{ 
 Buffer.alloc(s,44100 * 120,4)
}
)

(
Routine{
	var chan = 4;
	var finalBuf = Buffer.alloc(s, 44100 * 120 * 10, 4);
	var chunkSize = 120 * 44100;
	~buffers.do{ |buf, i|
		chan.do{ |chan, i2|
			("buffer:" + i + "chan" + i2).postln;
			FluidBufCompose.process(s, ~buffers[i], 0, -1, i2, 1, 1, finalBuf, i*chunkSize, chan, 0).wait;
		};
	};
	"write file".postln;
}.play
)

FWIW, because FluidBufCompose doesn’t have a threaded version (process === processBlocking, as @tremblap points out), a single sync before “write file” could replace all the waits: the jobs all line up in the scsynth command queue in the normal way.
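
A sketch of that alternative, reusing the hypothetical ~buffers / finalBuf / chunkSize setup from the repro code above: fire off all the composes without waiting on each one, then sync once before writing.

(
Routine{
	var chan = 4;
	var finalBuf = Buffer.alloc(s, 44100 * 120 * 10, 4);
	var chunkSize = 120 * 44100;
	~buffers.do{ |buf, i|
		chan.do{ |c|
			// no per-job .wait: BufCompose is blocking on the server,
			// so the jobs simply queue up in order
			FluidBufCompose.process(s, buf, 0, -1, c, 1, 1, finalBuf, i*chunkSize, c, 0);
		};
	};
	s.sync; // one sync: everything queued above has completed
	"write file".postln;
}.play
)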

User Error. I was doing something stupid.

FWIW - SC can handle files up to 1024 channels, but no more. This seems to be a limit of libsndfile:

https://github.com/libsndfile/libsndfile/issues/78

But this is something that should change. w64 files should be able to handle 65536 channels.

Sam

In Max, there is sfplay~, which was supposed to support high channel counts, but they hardcoded the file size, so it has crashed on anything larger than 2 GB since Max 5. I sent a bug report for 6 and for 7. I have not tried in 8… I’ll try that now just for fun :slight_smile:

edit: and it still crashes in Max 8! A segmentation fault… I’ll try the same file with the mc version…

edit 2: now that is funny, mc.sfplay~ also crashes!

I know this file plays in the Beast SC extension, since it was premiered there with that player, but I’ll try with the native one now.

edit 3: DiskIn with a proper buffer size works seamlessly. I remember Scott improving it a decade ago, and now it rocks :slight_smile: Now I’d better go finish that piece and write those unit tests instead of moaning about other people’s bugs :smiley:
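
For completeness, the shape of that DiskIn route (a sketch with a hypothetical path and channel count, not the actual piece): cue the file into a buffer with a generously sized cue buffer, then stream it.

(
~path = "/path/to/28chan-piece.w64"; // hypothetical file
~cue = Buffer.cueSoundFile(s, ~path, 0, 28, 262144); // large power-of-two cue buffer (default is 32768 frames)
{ DiskIn.ar(28, ~cue) }.play(s);
// when finished: ~cue.close; ~cue.free;
)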
