Don’t know if this is relevant here, but I think there’s a size cap on dict.view (above a certain size it just does…nothing, rather than telling you about it), so if you’re using that to inspect the dict, that may be part of the problem.
Indeed it is not deserialising. Whether it is our JSON or one of Max’s many dict limitations is a good question. My gut feeling is that it is the latter, given the length of your keys, but @weefuzzy will know more than me: he is an old convert.
Just passing through, but a sensible first test is to check whether the same data works with read (i.e. going to file and back again), rather than through messages. Whatever comes out of dump is going to be subject to Max’s 32767 atom cap on message length, which could easily produce incomplete JSON once the structure gets bigger.
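To illustrate why a hard cap on message length matters here: JSON that is cut off mid-structure simply fails to parse, rather than degrading gracefully. A quick Python sketch (the dict shape and entry names are made up for the demo, not taken from fluid.dataset~):

```python
import json

# Build a moderately large dict, loosely shaped like a dataset dump
# (keys and structure here are illustrative assumptions)
data = {"data": {f"entry-{i}": [float(i)] * 10 for i in range(2000)}}
s = json.dumps(data)
print(len(s))  # well beyond 32767 characters

json.loads(s)  # the complete string parses fine

truncated = s[:32767]  # simulate a hard cap on message length
try:
    json.loads(truncated)
except json.JSONDecodeError as err:
    print("truncated JSON fails to parse:", err)
```

So anything that silently clips the dump at an atom limit will hand the receiver unparseable JSON, which matches the “not deserialising” symptom.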
It indeed does not work if I take the output of dump and put it in another dataset via [prepend load]. A [zl 32767 len] gives me 1, and if I put the output in [text] it crashes Max. So that is definitely a Max size issue… which raises a bigger design question for us.
@weefuzzy, do you know a way to find out the size of a single item, like what comes out of that object?
A workaround I found for Max: write the dict to a file and load that into the dataset, or the other way round. It works both ways if the dict is correctly formatted. A bit of a pain, but it allows a quick bit of gaffa tape à la @weefuzzy to help @jamesbradbury move forward.
Under the hood, this is what we have to do in SC anyway, but I’m happy to discover it works with Max’s dicts too.
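The file round-trip the workaround relies on is just: serialise the dict to JSON on disk, then read it back, so no message-length cap is ever involved. A minimal generic sketch in Python (the dict contents and file name are hypothetical, not the actual fluid.dataset~ format):

```python
import json
import os
import tempfile

# A dict loosely shaped like a dataset (structure is an assumption)
original = {"cols": 2, "data": {"point-0": [0.1, 0.2], "point-1": [0.3, 0.4]}}

# Write the dict to a JSON file...
path = os.path.join(tempfile.gettempdir(), "dataset.json")
with open(path, "w") as f:
    json.dump(original, f)

# ...and read it back: file I/O has no atom/message-length limit
with open(path) as f:
    restored = json.load(f)

print(restored == original)  # True
```

Any size of structure survives this route, which is why going through a file works where dump-as-messages fails.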