JSON string not deserialising to dict

I'm on alpha04c and I cannot get my dataset to deserialise to a dictionary.

I've included the input data I'm using, and the dump message outputs something that looks sane; however, when it gets to the deserialise stage I just get a dict with one key, “cols”.

mlp.zip (20.0 KB)

Don't know if this is relevant here, but I think there's a size cap on dict.view (above a certain size it just does…nothing, rather than telling you about it), so if you're using that to inspect the dict, that may be part of the problem.

Indeed it is not deserialising. Whether it's our JSON or one of the many Max dict limitations is a good question. My gut feeling is that it's the latter, given the length of your keys, but @weefuzzy will know more than me: he is an old convert.

Just passing through, but a sensible first test is to check whether the same data works with read (i.e. going to file and back again), rather than through messages. Whatever comes out of dump is going to be subject to Max’s 32767 atom cap on message length, which could easily produce incomplete JSON once the structure gets bigger.
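For anyone who wants to script that round-trip check, here is a minimal sketch in SC (assuming ~ds already holds the points; write and read are the file-based counterparts of dump and load):

(
fork {
    var path = Platform.defaultTempDir +/+ "roundtrip-test.json";
    ~ds.write(path);    // serialise the dataset to disk
    s.sync;
    ~other = FluidDataSet.new(s, \roundtrip);
    s.sync;
    ~other.read(path);  // read it back into a fresh dataset
    s.sync;
    ~other.print;       // if this posts the full contents, the JSON itself
                        // is fine and the message transport is the culprit
}
)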


It indeed does not work if I take the output of dump and put it in another dataset via [prepend load]. [zl 32767 len] gives me 1, but if I put the output in [text] it crashes Max. So that is definitely a Max size issue… which raises a bigger design question for us.

@weefuzzy do you know of a way to find out the size of a single item, like what comes out of that object?
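Not a proper Max-side answer, but a rough gauge from SC: count the characters of the serialised JSON on disk, since a dump of the same data has to travel as a single Max message. Character count is only a proxy for the atom count, but anything enormous is a red flag. The path here is hypothetical; point it at whatever the dataset writes out:

File.readAllString("~/mlp/anal.json".standardizePath).size.postln;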

@jamesbradbury this patch shows you the error in a narrowed-down, bug-report-friendly style:


----------begin_max5_patcher----------
530.3ocwVFriaBCDF9L7Tf7YZBXvPSO2WgdpppxAll0ofAYa1lUq19rW7XHh
saBIBQ2dAiGF6uw+Ly.O66Q12bBzjfOE70.Oum887PSVCdCy8H07SEUbM5Fo
FzZ9AfD5dlANYP6EU.WEFn.dYv1unAkdaKe6mA8OMMsaqqZ2xk7pMG0Mxw0J
6pa5LUfA243AqsJPCRC2HZjeWAEFWzkDw1DEFjRSrCQCWB91vpzlmp.LPF2c
2VadpEba.gb14dvB4HW5HWto3Ag7vqXF4XhCzTDMkNAqnDY1r+3Gh2Qr1dw2
2dI7NkSI7q9E+F07GUchxMkbCWCleGHjsclME7B9k0tj40NJcmMxyh2sTsaO
WdfDZiqFtwdyUDy3qKlmCB78XbZzrhY9JJl8pRKHKC5i8xEk6Qoo1fkkmux4
dyJWLGSD2GimUsxVQ0R0G7PPYWc6k0J5spSoXbSWbt1LoW2tVk4jsrcypWrk
nWWqy20kpakVwnSZu79zRixXSXlLuLk7Oti1xZlkjgmfzj+mMybsuRY3Pb77
MynS0QzCRkP92e2EwXs+ZwU2zoJFC8gOzzejOCpDzFgDUooNQcNcwWf2Moj2
MR1MgdKRr0fD6dNSYqAor6gT9aH4RP3ssO1+STCdiP5qyN1nrSyCwoBoaJVw
PTvihQ+YnEtpOu0zmz1obkDmxRItk1TBJYm.Sy8sjew+OzKxYAH
-----------end_max5_patcher-----------

Even better. How to reproduce (a scripted version follows below):

  1. take the helpfile, json tab
  2. create a dataset of 400 entries, 10 dimensions
  3. fluid.dataset~ reports: Parse error
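For the SC-inclined, a sketch that builds equivalent data (the \repro name and random values are mine; addPoint and write as per the FluidDataSet interface):

(
fork {
    var ds = FluidDataSet.new(s, \repro);
    var buf = Buffer.alloc(s, 10);       // one 10-sample buffer, reused
    s.sync;
    400.do { |i|
        buf.setn(0, { 1.0.rand } ! 10);  // 10 random values per point
        s.sync;
        ds.addPoint("point-" ++ i, buf);
        s.sync;
    };
    // feeding the resulting JSON to fluid.dataset~ in Max should
    // trigger the same parse error
    ds.write(Platform.defaultTempDir +/+ "repro-400x10.json");
}
)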

The plot thickens. I'm on 04c and I cannot get the SC dataset to load the input JSON file ‘anal.json’. It complains that it's invalid.

(
~json = File.readAllString(
    thisProcess.nowExecutingPath.dirname +/+ "anal.json"
).parseYAML;
)

(
fork{
    ~input = FluidDataSet.new(s, \input);
    s.sync;
    ~input.load(~json); 
}
)

I assume my SC code is wrong.
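One thing worth checking before blaming the load call: SC's parseYAML returns every scalar as a String, so even structurally valid JSON comes back with “numbers” that are not numbers, and it is plausible that load rejects that as invalid. A quick probe:

(
var parsed = "{\"cols\": 2, \"data\": {\"a\": [0.1, 0.2]}}".parseYAML;
parsed["cols"].class.postln;          // -> String, not Integer
parsed["data"]["a"][0].class.postln;  // -> String, not Float
)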

Try using:
~input = FluidDataSet.new(s, \input);
~input.read("file.json");
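If a bare filename does not resolve, building an absolute path the same way as the earlier snippet should be safer:

~input.read(thisProcess.nowExecutingPath.dirname +/+ "anal.json");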

Yep, got it now. I think this warrants another thread to clarify whether it's wrong or just needs to be pinpointed in the docs somewhere.

https://discourse.flucoma.org/t/loading-a-dict-which-from-json-should-be-the-same-as-reading-the-file-directly/600

A workaround I found for Max: write the dict to a file and load that into the dataset, or the other way round. It works both ways if the dict is correctly formatted. A bit of a pain, but it allows some quick gaffer tape à la @weefuzzy to help @jamesbradbury move forward.

Under the hood, this is what we have to do in SC, but I'm happy to discover it works with Max's dicts too.