Here are some thoughts after having a look at the beta00 CLI tools on Linux:
- Usage info, i.e. with a "--help" flag, would be really good.
- Output of indices/data to text could be useful. In what I tested (e.g. bufampslice, at least on Linux), the output goes to a WAV/RIFF-headered file, but I am not aware of anything that would take that as an input, so another step is needed to parse out the sample positions/whatever data it contains into a list of numbers that can be used in some other processing (see the sketch after this list). If I am misunderstanding the benefits of this then please point me in the right direction…
- Argument validation / sanity checks: e.g. if for some reason I am inclined to run "… -maxfftsize 9999999999" then the output is 2000 characters of stack trace. It does detail the problem, but maybe not immediately clearly, and there may be examples not quite as ridiculous as the above command.
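To illustrate the extra parsing step in the second point: something along these lines gets the numbers out for me (a rough sketch, assuming the indices come back as a mono float WAV; soundfile and the file name are my own choices, nothing the tools mandate):

import soundfile as sf

# read the indices buffer back in; values survive as-is only if the file is float WAV
data, sr = sf.read('indices.wav', dtype='float64')
slice_points = [int(x) for x in data]  # sample positions as a plain list of ints
print(slice_points)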
Regarding the first point, actually that does exist as “-h”, but I didn’t originally notice it.
Indeed, but we plan to have something a little more fleshed out in the future. In the meantime I hope the provided website is helpful?
As for the other proposals, indeed they make sense (and CSV output is indeed on @groma's short-term improvement wish list).
Is that going to be all platforms?
Everything we do is cross-platform. We try to establish a community of practice across CCEs and OSs - this is why all terms, functionalities and results are the same everywhere (including most 'unexpected behaviours').
CSV/JSON/binary (something like Python's pickle) output would be really nice. Right now I am using Python as a wrapper to do Max stuff around the CLI so I can execute an instance on each core, which is fun, but if there were some more native data types to spill to, that would be really great and would make its appeal much wider to a number of users, I am sure. Working programmatically in an agnostic way has been a real joy compared to writing sluggish batch processes in Max, so I'm quite invested in how these things will develop.
I just mention it because there was great resistance to suggestions that differed from the everything-is-a-buffer paradigm in other threads. So this is the first I've heard of having native options for the output type.
Don't worry, it won't be possible to do CSV in Max
Buffers do not exist in the CLI, so everything is a file at the moment, which is stranger than buffers~ in the CCEs… but in the same way that we are considering utilities in Max, SC and Pd, there might be a utility in the terminal… all ideas of data transfer and structures are very much on our mind all the time. You talk about resistance; let's just say that we think about this a lot… so 'resistance' is probably not the right word…
Yes absolutely, the documentation is clear. Usually the first thing I run on a program is 'command --help', though, so I was a bit surprised when that didn't work, but "-h" does fine.
CSV sounds good; maybe JSON/a serialised array could be good for certain things (e.g. MFCC). At the moment I too am using Python as a wrapper, which also made me think that (down the line, when the source is available) it would be interesting to look at having these tools as an actual Python module.
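To give a concrete idea of the kind of spill-to-native-types I mean: a rough sketch, assuming an analysis such as MFCC comes back as a multichannel float WAV (soundfile and the file names are my assumptions, not anything the tools provide):

import csv, json
import soundfile as sf

# one row per frame, one column per coefficient, assuming a multichannel float WAV
frames, sr = sf.read('mfcc_out.wav', dtype='float64')

with open('mfcc_out.csv', 'w', newline='') as f:
    csv.writer(f).writerows(frames.tolist())

with open('mfcc_out.json', 'w') as f:
    json.dump(frames.tolist(), f)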
Buffer output in a CLI app is more bizarre than in the other environments, I think. At first I thought it was a bug whereby just WAV headers were being output, then I realised they were dumps of the buffers with the indices in them.
In one way I kind of like the everything-as-a-buffer thing, although I can't justify why I like it, but then there would need to be another CLI tool to convert to/from data types, I think (which is probably more straightforward in the other environments).
@groma made the request for the same use case - Python batch processing. And @jamesbradbury uses it that way too. We have to think about this carefully, also in line with the 2nd toolbox, but keep sending suggestions and use cases. At the moment @groma has made a workaround somehow, but I cannot remember what it was. I'm sure he will jump in when he is back from holidays!
Can I point you to a snippet which has saved me a lot of time batch processing? I hope this doesn't come off as condescending, as you are possibly (probably) more experienced in the language than I am.
with mp.Pool() as pool:
    # spread analyse() over every core, one job index per call,
    # and write a rough progress percentage to stderr as results come back
    for i, _ in enumerate(pool.imap_unordered(analyse, range(num_jobs)), 1):
        sys.stderr.write('\rdone {0:%}'.format(i / num_jobs))
This runs an instance of the CLI tool on each core of your machine, where analyse is a function that receives an iterable, in this case just a list of indexes that punch through an associative list containing file names. I see incredible gains doing this and it's really sped up my workflow.
P.S. I am doing: import multiprocessing as mp (plus import sys for the stderr progress write)
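For completeness, analyse in the snippet above is roughly shaped like the sketch below; the command, flags and file names are hypothetical placeholders rather than real invocations of the tools:

import subprocess

# associative list of job index -> input file (contents are placeholders)
files = {0: 'drums.wav', 1: 'voice.wav', 2: 'synth.wav'}
num_jobs = len(files)

def analyse(index):
    # each worker shells out to one instance of the CLI tool;
    # 'some-fluid-cli-tool' and '-source' are made-up placeholders
    subprocess.run(['some-fluid-cli-tool', '-source', files[index]], check=True)
    return index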
This will be useful, thanks. I've only just started looking in the last couple of days and hadn't even considered multicore stuff. It does make me wonder: do any of the fluid implementations, e.g. in Max, take advantage of multiple cores?
It will be multithreaded at one point. There is a working prototype for Max at the moment, but it is not trivial, as you can imagine, as these are CPU-heavy processes yet users have CCE-use expectations…