LTE - An alternative to LPT for people who don't care about "pitch"

It oughtn’t to be that tricky to get something to converge (getting useful results is a different matter). Always (always) start small, with low learning rates, and embrace tweaking the LR (to try to find a sweet-ish spot between not converging and converging-but-too-slowly) as part of the model design. Be wary of making the network bigger unless you’re sure you have enough training data to absorb the extra complexity (and can live with the extra computation).

I think UMAP’s had it since Alpha 07.

No, it shouldn’t screw up convergence in and of itself: if the whole range of your feature is huge, then that might give odd results, but in general points outside -1:1 are fine (consider that the network is learning weights to apply to the inputs anyway, so these weights can just get smaller, within reason).

It doesn’t screw up the KD tree either. The tree doesn’t really care about the range of the data in absolute terms, but the Euclidean distance it uses implicitly assumes that features are comparably scaled (Euclidean distance = square root of the sum of squared differences between each feature of the two points being compared).
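To make that concrete, here’s a tiny plain-Python sketch (the feature names and values are invented for illustration) showing how one large-range feature swamps the Euclidean distance a KD tree would use:

```python
import math

def euclidean(a, b):
    # square root of the sum of squared per-feature differences
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Two hypothetical points: [spectral centroid in Hz, pitch confidence 0-1]
p = [2000.0, 0.9]
q = [2100.0, 0.1]

# The 100 Hz centroid difference dwarfs the 0.8 confidence difference...
print(euclidean(p, q))                 # ~100.0

# ...but with both features brought to comparable ranges, both matter.
p_scaled = [2000.0 / 10000, 0.9]
q_scaled = [2100.0 / 10000, 0.1]
print(euclidean(p_scaled, q_scaled))   # ~0.8
```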

I think some of this will partly be down to it being a new way of working (and there still being unsanded UX edges to the toolkit, natch): to a very large extent, ‘programming’ with ML stuff is about data monkeying as much as (or more than) it’s about models. I’ve certainly seen the argument put forward that it constitutes a completely different programming paradigm.


I need to give this a good college try I think as, at best, I’ve left it running for hours with some generic-ish settings and that went nowhere. Obviously not an ideal way to go, though I did use a slightly modified version of @tutschku’s looping patch from the last plenary as a starting point. I just haven’t gotten my head around all the parameters and how to massage them in a way that works.

(Much like the dance of “pick the correct descriptors with your fleshy human brain so I can tell you you chose wrongly”, it feels like picking parameters for this stuff is pressing a button that says “change random numbers” and then the computer buzzes you and says “wrong, try again”, over and over until it, somehow, magically works. (Surely unsupervised parameter selection for supervised machine learning is a thing?!)).

Ah right. Forgot about that (one’s ability to transform points). I’ll compare and update the speed comparison thread accordingly.

All good to know. I remember running into some issues with that (see point 1 about lack of convergence) when we first got these, as I had no idea what the ramifications of the @activation param were (à la fluid.mds~'s zesty @distancemetric).

I can totally see that. I don’t want to derail a very useful conversation here, but I’d say 80% of my faff/friction at the moment has to do with getting the specific numbers I want, in the places that I want them. So I’m hardly even at the point of legitimate confusion, even though I’m quite obviously confused a lot (this thread included).

But indeed, the paradigm of moving around and transforming huge fields of numbers, with every tiny thing mattering in a way that is (often) unintelligible to humans, is pretty hard to decipher. Particularly with how “it depends” things can be.

Actually, is this a thing?

For example, a NN could expose the individual node weights and multipliers, but it would be ridiculous to try and set those manually. So is it possible to just zoom out the gradient descent thing so that the initial parameters for the algorithm itself are randomized and iterated until convergence happens? I suppose that can be a very slow process if it picks some horrible starting params, but with short enough iterations and changes, it could presumably evolve to something more useful all on its own.

If it is, I would love that…

Just have an object called fluid.stuff~, and you send it a message makeitwork, and 2 days later you come back to a converged network, and somehow 0.3333 of a Bitcoin too…

It’s kind of a thing, yes. See, e.g., 3.2. Tuning the hyper-parameters of an estimator — scikit-learn 0.24.1 documentation

But, still, the fleshy brain remains responsible for gathering the good quality data to feed such a beasty. Often, the time and computation investment to do this sort of thing is overkill, and a disciplined manual approach can get you somewhere useful quickly enough, once the various moving parts make more sense. In this post, and the one after it, I pointed Alex towards some meatier guidelines and explanations. The bottom line, though, is to start with hidden set to the smallest network you can get away with (i.e. for an autoencoder, one layer the size of the reduced space you desire) and a minuscule learning rate (like 0.00001 minuscule), and don’t worry overly about the other parameters until later.
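As a toy illustration of that scikit-learn-style randomized search idea (everything here is hypothetical: `train` is a stand-in for your actual fitting step, with a made-up loss surface whose sweet spot sits around lr = 1e-4):

```python
import math
import random

def train(learning_rate, seed=0):
    """Stand-in for fitting a network: returns a fake validation loss.
    A real version would run your actual trainer for a few epochs."""
    noise = random.Random(seed).random() * 0.1
    # Pretend too-big LRs diverge and too-small ones stall
    return abs(math.log10(learning_rate) + 4) + noise

random.seed(42)
best = None
for trial in range(20):
    lr = 10 ** random.uniform(-6, -1)   # sample log-uniformly over 5 decades
    loss = train(lr, seed=trial)
    if best is None or loss < best[1]:
        best = (lr, loss)

print(f"best learning rate ~ {best[0]:.2g} (loss {best[1]:.3f})")
```

Sampling log-uniformly matters here: learning rates worth trying span orders of magnitude, so a linear sweep would waste nearly all its trials at the top of the range.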


The idea was to keep a sort of perceptually equivalent scale, so 1 dB is 1 semitone of pitch is 1 semitone of centroid. Simple to understand the relations in a musician’s space, but timbre is super limited there… trying to shrink an MFCC space to 1D of 100 points (the range in dB and semitones of useful loudness and pitch) is still on the table, via k-means first. This will happen when I get the headspace, which I hope is soon.
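That k-means-to-1D idea could be sketched like this (a hypothetical stand-alone version with NumPy and fake data; the ordering-by-first-coefficient step is just one guess at how the 100 clusters might be laid out along a line):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal Lloyd's-algorithm k-means; a stand-in for a proper library."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        # assign each point to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return labels, centroids

# Fake "MFCC" data: 500 frames x 13 coefficients (real data would come
# from an analysis chain)
rng = np.random.default_rng(1)
mfcc = rng.normal(size=(500, 13))

labels, centroids = kmeans(mfcc, k=100)

# One way to get an ordered 1D axis: rank clusters by their first
# centroid coefficient, then map each frame to its cluster's rank (0-99).
order = centroids[:, 0].argsort()
rank = np.empty(100, dtype=int)
rank[order] = np.arange(100)
one_d = rank[labels]   # each frame now has a single 0-99 value
```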

Oh yeah, I remember that discussion. I don’t think it happened on the forum at all, so I’ll make a thread for it now.

With regards to the scaling, at the end of each of my processing chains (except Timbre) I’m ending up in “natural” units (e.g. linear amplitude or MIDI cents), so I can presumably scale one to the size of the other and those would be alright. I can obviously properly normalize (or robust scale) things afterwards if I want the entire range to be used, but I guess that’s more an aesthetic choice.

I’ve been thinking about this again over the last few days, in light of some of the info from @weefuzzy in this thread and some of the comments from @tremblap during the Thursday geek out sessions.

I’m thinking of abandoning the E(nvelope) part altogether, since with the short time frame it isn’t massively descriptive. That being said, some of the clustering from it was alright, since it relied heavily on a mixed collection of means of derivatives. So those may be useful to keep, but perhaps moved over to their respective hierarchical descriptor types.

What I’m also thinking about now is incorporating more vanilla spectral descriptors alongside the MFCCs, as well as lower order MFCCs, to create a more comprehensive T(imbre) space. I’ve done a tiny bit of testing with this, but manually assembling variations of descriptors/stats takes me a long time, so it’s a bit discouraging to code for an hour and see bad results, then code again for an hour and see bad results, etc…

I’m also rethinking trying to “balance” the number of descriptors per archetype. Timbre is potentially over-represented, given the number of spectral moments and MFCCs available, so reducing that down is definitely worthwhile, or eventually doing some of that k-means clustering-as-descriptor thing that @tremblap has talked about. But Loudness, and even more so Pitch, doesn’t really have that many dimensions that make sense. With my short time frames, I could potentially forgo summary stats for Loudness and just take each frame, potentially alongside std/min/max and derivatives, so that loudness is as comprehensively represented as timbre.

For pitch, however, there’s only really one value that matters…pitch. Confidence is useful for forking or conditional matching (separate conversation), but as a raw descriptor, it’s perhaps better suited to describe timbre.

So unless loudness and timbre can be boiled down to a single number each (and even then, a lot of information and detail would be thrown out), it will be hard to have each aspect equally represented.

For 80% of my purposes pitch will largely be irrelevant, since I don’t have too many pitched elements in the input sounds I’m using. There sometimes are, and when there are, I would like them considered, but that can be handled in a different way (biasing etc.).

Towards that final point, is it viable to distort the space such that you have (as an example) 10D of loudness stuff, 10D of timbre stuff, and 1D of pitch, but the pitch descriptor is scaled up 10x so that it impacts the overall distance more? Does that just skew everything around in a different way than if you had 10D of pitch information?

IIRC @balintlaczko had done that in his patch. If not him, it might be @tedmoore. If neither, I’ll look further, I have notes somewhere of someone doing an even larger timbral space…

Yes, it is different. Think in 2D: 2 × 1 cm is 2 cm if they go in the same dimension, but if they are orthogonal it is the diagonal, so √2 cm. You can try it dirty now to test if that works both ways, and I’m devising examples for scaling with fluid.normalize and JSON that you will love and hate.
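A quick numeric check of that, in plain Python (the toy 10-dimension layout is made up to match the example above):

```python
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Case 1: one pitch dimension, scaled up 10x.
# A pitch difference of 1 (pre-scaling) becomes 10 after weighting.
a1 = [0.0] * 10 + [0.0]        # 10D of timbre + 1D of (10x-scaled) pitch
b1 = [0.0] * 10 + [10.0]
print(dist(a1, b1))            # 10.0

# Case 2: ten pitch dimensions, each differing by 1.
a2 = [0.0] * 10 + [0.0] * 10
b2 = [0.0] * 10 + [1.0] * 10
print(dist(a2, b2))            # sqrt(10), about 3.16
```

A single weighted axis contributes linearly to the distance, whereas the same total difference spread across many orthogonal dimensions adds in quadrature, so the two layouts really do skew the space differently.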

Yeah I’ve done this. Lately, I’ve been using FluidSpectralShape + FluidPitch + FluidMFCC (often between 9 and 13 MFCCs). This all usually ends up in an MLP, maybe even through PCA first, so it all kind of gets put in the wash anyway.


Hey there! I am working on something similar now, but I cannot really confirm yet if it works. My dataset consists mostly of short sounds (short scratching gestures on various objects), and here is my list of descriptors that I am trying to use now:

  • length
  • loudness stats
  • attack loudness (first 100 ms)
  • attack strength (attack loudness / mean total loudness)
  • spectral shape stats
  • grain density (length / graincount)
  • spectral grain density (length / spectral graincount)
  • transient density (length / transientcount)
  • tonal strength (mean loudness of harmonic / mean loudness of percussive, via HPSS)

The idea with the “grain density” stuff is to use the ampslice, noveltyslice and transientslice to get an idea of the grittiness, or granularity of the sound (and maybe some vague spectral morphology). It might be BS, I have to test and see. :slight_smile: When I have the feature set I UMAP it down to 3 dimensions, at the moment it looks like this. There will be some spatial granular synthesis involved, that’s why I wanted the 3D.
I could never get a good intuition on using MFCCs so far, so I am kind of avoiding it… :slight_smile: I tried earlier to use them as general descriptors, but it always turned out that I could have had the same result (at least on my dataset) with far fewer, more targeted descriptors (like loudness, centroid, or flatness). But then again, maybe a high-res MFCC + UMAP combo would make a lot of sense for a “general purpose” application.
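A plain-Python sketch of the ratio-style descriptors listed above (all the names and numbers here are invented for illustration; the counts would really come from ampslice/noveltyslice/transientslice, and the loudness frames from a loudness analysis in dB):

```python
def describe(length_s, graincount, transientcount,
             loudness_frames, attack_frames):
    """Hypothetical helpers for the ratio-style descriptors above."""
    mean_loud = sum(loudness_frames) / len(loudness_frames)
    attack_loud = sum(attack_frames) / len(attack_frames)  # first ~100 ms
    return {
        "grain_density": length_s / max(graincount, 1),
        "transient_density": length_s / max(transientcount, 1),
        # "attack loudness / mean total loudness", as in the list above
        "attack_strength": attack_loud / mean_loud,
    }

# Made-up values for a 2-second gesture
feats = describe(length_s=2.0, graincount=8, transientcount=3,
                 loudness_frames=[-30.0, -20.0, -25.0, -25.0],
                 attack_frames=[-20.0, -18.0])
print(feats)
```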


That’s a cool idea. I was using time centroid before to give me the overall “longness” of a sample, to good effect, but this is another interesting way to get at a similar idea.

Hmm, this is interesting. Do you mean you have like a really sensitive transient detector and are using that like a zero-crossing thing?

It would be cool to have something like that natively, though I guess the idea is to have spectral flatness and/or pitch confidence act in a similar way.

The video is great. Would love to hear what the sounds are if you have anything like that in context (or even just browsing the 3D projection).

Almost, but maybe on a slightly more “macro” level. I have some sounds in the dataset which are more granular than others, (like a slowly rotating rattle) and I wanted to distinguish between these and the “smooth” ones. There are also some objects that produced some strong “clicky” transients 2-4 times in them, that’s what I want to listen for with the transient slicer. And when it comes to longer gestures, there are ones that are just one continuous “note”, and ones that have several slightly different such “notes” in a similar time period, but not necessarily with strong attacks. These are the ones I want to catch with the novelty slicer, though this is the part I am most unsure about at the moment.

Thanks! It is really just for visualization, and I am not even sure it clustered the sounds meaningfully, it is a work in progress… I’ll make some examples with sounds as soon as there are some results! :slight_smile:


This UMAP shape is really interesting. I don’t think I’ve ever seen that kind of convex 3D U shape before! Do you happen to remember the settings?

Yes, it is quite similar to the help file: numneighbours=5, mindist=0.2, learnrate=0.2, iterations=50. I actually swapped the standardization for robust scaling (following @tremblap’s advice) and it looks quite different now (see video link below); there are far fewer completely isolated pockets than before.
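For the curious, the difference between the two scalings can be seen in a plain-Python sketch (standing in for what fluid.standardize vs. fluid.robustscale do; the data is made up, with one deliberate outlier):

```python
import statistics

def standardize(xs):
    # (x - mean) / stdev: an outlier inflates both the mean and the stdev
    mu, sd = statistics.mean(xs), statistics.stdev(xs)
    return [(x - mu) / sd for x in xs]

def robust_scale(xs):
    # (x - median) / IQR: the outlier barely moves the median or the IQR
    q1, med, q3 = statistics.quantiles(xs, n=4, method="inclusive")
    return [(x - med) / (q3 - q1) for x in xs]

data = [1.0, 2.0, 3.0, 4.0, 100.0]   # one big outlier
print(standardize(data))    # the non-outliers get squashed together
print(robust_scale(data))   # the body of the data keeps its spread
```

With standardization the outlier dominates the statistics and the other four points collapse into a tiny range; robust scaling keeps them nicely spread, which is one plausible reason the UMAP pockets change.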

I made a little test with sounds, and I think the clustering works quite well. Here is an example. FluCoMa is great!


Interesting. To me, the pockets are sometimes where UMAP sends the novel and unique things, but a smoother space can be useful too. I need to try robust scaling more!

That looks and sounds great!

Is your visualization stuff in Max/jitter? I’ve been trying to make a navigable plot viewer thing but have struggled when going into full 3d.


Whoa. Great work!


Yes! The point cloud is a single [jit.gl.mesh] that gets the vertices from a matrix that I fill from the dataset (actually that dataset-to-matrix thing is a good idea for an abstraction). That [jit.gl.mesh] lives in a [jit.world] of course, and there is a [jit.gl.camera] that I animate with a [jit.anim.drive]. Here is the part of the patch that does it:

<pre><code>
----------begin_max5_patcher----------
7927.3oc68j0iaiblO6.j+CD5Y6F0AKdjmbbbFGf0dxfwNXPvf.AJopkoaJR
sjT8QFr929VG79rnDIa0xpGLcSqRjr9tOqp9i+7e5UKVE7HMZg1eQ620d0q9
C1m7Jwmw+jWk9AuZwNmGW64DI9hK9la7M2tOZ6A2EuNY7aC7iu0YMkONH8Cc
2H95Aq91affru5tfMhuFN8C16Du9qt9aWFRWGKmIH6a.uVCYXx+iEf+arwM.
s+Sw2Wj6+U7ffnaxdk9G1456QiESTXgOM3Pb5GiJ9P7c1IdHK9qgtNdYSR4W
O9o8T4DZwhWy9es+iX3+u+7eh+W1ed8jhyzgK5BGYfE3EaqKFjjO8AFfm87i
oOJf0EwZqzxeskXqr6BEAgBTDTxNoSDLUnhnnAgJpAuqb72dpvLGHWcHNNvO
CRXPgDtaFjMx93vfC9anhwL.4z2FvDVRdErt.E.R+cNhfOMVG3EDJ+9fav51
.zqa3BX+3OO5Vm0OU5QlMl6V+.17xyc8ck978NgLtrXZ3RpuyJuxREkIH314
M0dWf2lL7ypsE.IHGnA2XZYgvD9U5l.hkUE.pYdZ1+KnI+mZrlFKlQJu9QS4
g1F2PtR4GSJO43o7qC1sixdp0dldANaz1GvdiQMyAf6TcmNr.sVZ7rLsVE0c
fwGrBoQzXs0L5UnSyvEpK3h.zkZuDV5Lsma3pMyR66Dj5z3stokzds3OD6iy
xTY4dZXBPjBELwDWO58zvHWlNlBOlWsvY+9Be9qJdSbf+aRQGqWm+Yt9kEm4
Jgn26l8nK74NgLvMlAqGBkBnOZjo4JwAvPeliP7aK4S4ziz4l.wyEti1m3cj
f.kMdAcaDnjW.IruiEp5P1lEvibUAdAquSpqDj8oA6o9t964bl9wNwIPQ93a
n25bvKdYK9PU9KT0Ktxi1r2SuZw1P2MA97IRYZC+ySekLgZgWK7eW.lDeEem
8Mc6LtOFBpsQiX.6gnUNgbRWhxVT1nwAAdkGK+N8n2FmL9dWe+pHz3f8cLZn
61u10cuJfM5tNe7hghVdvWN7RFWR7xHm6qf4ic77RDXq7Fdzw2cmSLM1UROP
f7QkVd9Zz5v.Ouxvsbn6aZnMLAf0zGb2D+UwKqD6A6Fb2mxXsHmruwcKMJtx
GF6rMpxGEE+jjDT7yNrJQTeYLc2dOFzT4avDcbihi9ZvCQIeyT1uRnh7P9JI
5WTUX4A5TkXY0hQb0h6ChJMZAUiVkFnoHaDJEQ5BwYCPMcisqerU09uJW0ep
5lIA12RYxWwgLa3QtBZeK3.SEwA.CQvclmFN.WbnV8yIMJl4BWwc1QFAUy3H
cidQRPoCORinVmFRB1IRRLQmYDzF20wZGbE+4szcqnazfshrPJhrzMjgDCU.
agZGao2I1hOkYb+NgOkwbIYvJdWabhcJfhJhid0h6nOsyYewgKMNSQm.z2Eb
OU6MPMfFnH7ye54i2zv+27gAhGPkw+eKNd8gen3v7GPkwiJONbQgQKv5TxwG
w8wrmsYozD9RtlDWVDhRk0MhlDbEU4d6NjppeCFZde12ozDazY1YtzE4rk1L
2N257RQjBswiSf8wiCwREBPgQCCqSiGuaMBys1.dFD43na1D5xXqd6A2kdLS
8T+NTJX2KBiHcuTGI+C4zzgh52Pyy.Vaq2Mx.0znOx7Lwig7XBV6CXpS0dKO
3.uff65.KZoHaWhGKHoJ1oBKxfnka8VxgPd7USDNULQaCiXaqJFQvVgAo+tC
7A3DsLW76jlHhRtFKfA+MzGKF243gv3JqyStuzDE6ijdDumR2zJ6kpd4H8ED
R.SoeNSAuTWnFpSD81C9qaC6npOfx7obggbjZoZC0.UV4N9GNbS+dCijRSll
Wf3lTaZsgcvJhcPOypbJjwKOW+VyYg..3egVvaQAGBWm91Rh9lY5ubvHznXW
+rDA964l9YtuO.p2fmKbmZ0.CYxTBYFtQl1W3DNE4tQNro3DNYvmQyEzYzbA
dFMWzUdtXO8yEz4zjwP4IC5bRyicGZd.S3Tz3Lh1Yp5bY5kurUVgbJMtUCrx
+QdVn5KCT+wQVfisdAqb7pTRhFK.R1L5nqWaklpnXfqVpTTVrz4G8ze2WQYG
PCL.6tWhlfdmZulSzxcNL53iMhQLTnAp3tCRXwXAOtxT2NPyykQxbK+Q9iZw
r0KFlOQWzS.FHx0hYesX1WKl8yUwreSG4EtunnwLYLBuuTrE8dm72ST0HjMo
1nmkgJszUYbiTys1ZpmWjlSHUC7lUNQsmoydS6.lXHajGApBlXGFb9T5+xdV
TigfzaF4zEoahHT91jGF8.e8UWuy9RZyKGSBaCSkBGAsbePTY2SpgVAphVSp
hEQepq+R5Dd9qmEyHOyL3FsHZLWtC1pvlMPQ0Sjjj7AvuzRAZmLZAt9Z5s1B
R1pp6VFR.Anfta8WJ3laCC1E8ztUAdsheT11FPjZXS3KMdmt0jagUD9QIP9O
dZx4s3yMr.iBasxcpJigk8sqt4kj9G9RVxsUTikhVzfX6jd1hLcFzRWESSjK
j81gaINBbXmydVX46ZEoYnJ+DPFkIx7YuE2N514B7RnctNVG8zvZ25E3DiQs
FofkpNunmrhlvWtt7UZ0OVy4XrhpRPIoWxReNzk7b4V7F2V0ef6ueILE3Fno
LamjWZ1i5r8JQ5.6VYh5uGjHlEPMJ0O.unZrxgFSJxPQlItlIK.AZRLsfHxE
dzogr4CUSju21ZZIkc8Q1WWFmXep+7zQpcJI1E5gf5muRDPJNuwJuTjCu06f
6la3qJfHZ72yjEc7cZkYBpF1xBeCtjLnt8TtfZRMBJbx44PL7fe2o+fXppqz
FBkXFmX38D0V+QS35Pp61YFqFaD7GmtYVNuZEgobhtsfJix5JQ9iC5XpZmPt
ZHMjJsiAbF5CLKU6MDiYnY8TtEiPFmOckl0LzHX.UmLbYsodxndOUNSSFkZN
WK7zOYHJiY.Se+5obmLBgS+jQ81gSedlLXkEswS+jAo7jAM8SFnxSF3jNYLT
tcNMldMvBqeJgYZxH1rz28FJaJWGeFgu3YSp52bd5VX0aKVyIGeYorQUio20
GK06H+oetnLUxd5soZqreX3YnstUlkgf5ostK23csEoSdybdXiavmEcS4xOQ
8OTHtpzt+rEfX01ac87x1r0ZrpVog2tP90JU3pR67bHacHzRriyAwlHCwUrK
HjpJTRtQX5cpSR2q5zMPl5xc4M918FleEno6Ek+VAPa46BXaAzkWw9HX4cHt
jdRzeaRilZVnEIkD7vf8AgYse6MX6x24g3fsgNabShds8py85hLQgrudCsOI
OhcFYwszSpYFwxj4ckxp8XRdy.tNHTXSCcSn.EaoigBjsdS34SgFUgoZxHuy
NA78RhkFDx2sC09k.dxibUmbVrAv4c2+MUzgL0y6Oo+wSaxVKsSb8qJStA2X
QJ+LJ1R6+docRxYGwbxTSvOHHnRK78q7PEQQCE0jtVO98p9q2Gd6pcAksKzG
qxbyk37n4vU0bkdeV4GvGOr1QchG0ixqXRIrgokggovMYK.zxRdkNAfP0wFK
hXOAQ2tU9Qf.F1Bu0gFlHoi1V..rAxXYJgtIAJ2HlQ1XKa4UDH6o0vclsWRu
z0muJ+nYzSrMBZHu2Tm+QlbloFdJiE+6IvGlw5akEghUx9PM+plm3U4+MH.I
J2f+io7J4ia9BUoCMZ+y8TesO63Go8Y5N2URifkIDkvi1.cnos.CXSzMMkrg
HaIqIwn7M6rdcU9XLm3K2ixSQM7.F0SYklCoQtuIPPoMf7w07qhd1I1kyy+E
hLafOr3hMYje5+hVhoqo6U33ef58FVPUW8d5p2SsvhDGOgx1yCH7nFDo8qzs
G7bBWLOulgDPxUohyJeLqRHKSJm3W0U9lKH9lYm24J+ykF+C5J47xgbdkXdA
QLwWIlWNDS84yL8Uk.WL7Mn4z8NTYNmo8kgmuWk9U4gKC4gqJ1tPHjW8r4Bg
PNad0neMsCWL7LW0ieoPJIWIjWFDRiIUO9OEDp8Q26opyszRyC1Qeo.twf2Y
634ohe+1W9oSqIgzMfPfrW5sQF5BfhXCsLs5qEef792W1CGV.FBQb9qm1I+U
uyJUD05bt4c5qeSN6WmAJz8qBVTQecgfDCIzxaTIAUztyle8e7zlvfsT+uHH
Gc04OPCLS0n3sXpaXambEnod.qg99g.XZVEs5hAACkOG1zEMm88iX6Rvcs5x
XkEtr4RFutwqTuK3J0yO8HYYeRBVss3cx6qMrsorY1v115vjqL4KAmIzdmUl
515WMUMhhBseUijhBBJef5Su2oaQjAfPaRDoWFrYTF4cU2tUUWRAJwoVYrHf
jqlaX38EOvCdQBAevw0+6Go5Jblc75WM6vQzCtLIP0gjZRGvDYz7e8b..nWt
Pv+Sjm6lRa9uGi+kCwDXyMg9fdD8f95PUbcc6yHt9izMmFd1nSKS00T8bAne
RrwEtNN7jzy9rM646+teeFkHpvidNog9mC1PilQTwQ6RXStHOm3ogpF87Qqz
OeX2pgLyaR+ccOile33WbWGOKFylSXheTfE+x1Y0OuNXO8X8VMeWfvFy+oMa
eUTgdNQC+hypSigbPpAOeQCzGiWchQO1gedMD27ypV0uDrkmghShv2RQG.cR
yOm7d3esaP6AMmOQE8aN2SuMHb22OMyhcKABtgbtSG23Dd2a74GShuQjqL0Q
GYI57XVsrWqe2XlOS8LqFlY0wJqhVVBlwd0kZSPIYtjvPjfDGJlYKIea0pic
GCw9JG1K8kt42RN5PAfoqKUyeGnimSqpVcOpSnpq0cvzVZmA7luHUElusKT6
htKyyeckGk45pbKQ3cAA2MaL8t92chIMjgnAVY9BiM.LqtJmWVUu6i5lp4jf
xusZxT4Mpf3dvEuBNIBUCbN+rXu34q4CLjzhre2WD6Cx8vFbPoKpvbHndmL0
S+F+Ll+EUDOdbu6GZf4MpmIymRRlIkTl7dS4MvVOY+Cx1FaYlbko0b5d4NmG
M3GqFCOFmg0LGkf8Jj8ZwD1.qtU4TAYlnGXVwSUOGVFTN8PXBxtruGbge3LC
C0ObmlJ58HGpgU1dFU8q5y9goEe2z80Md0ymabi+9qaCByGESRsCPmAWWmBa
owyc.yxyVp24cfNjEzPS6NWPyjl7KWKucQc6yHzDF7f+IBN8ZpZ9.m+1SNmJ
z.P1DYO8ZZhwHo5.Dj4Q7rCMeHjROYvQPWxp1Czf6Z3rCI+JcyIBGlYt0CyB
1FjX5alAl+M0yK3gSX+1EcCfXC.V86aBhHahWSKnkH5F1MRx1VCmcPF8iAHK
Jg+GB7NYd1rj6RXtWRjsicQS7yAzr+P3duwvsLyrs5YcHgoFQnXjqMgnRDI5
.oSpHBvvVHGirHFHqYDSDRWScueHMfSS7mYwSmiE.E2qQmEHQFd7GFzVqYYa
15FFxplwrsYlrCixDyrw8SLOKXqin9ahdQaYLJdUkjuOgaMpO26F6wN6GFjV
Xx9Qm3.sOxSrxrMYiho2dvyKdfscP4h7oCY94K7xOKTOdZ0ZbYhUcY.YQHPY
sEMvlX45lyfomYN63nTzP7Hz7ECJcViFZDYYZYfkmWNox4XjM1dNM6DSidou
Io9.OOrOcpcuAfSHEoHAmP+X1RRpDt5KsqArWeWAegMTaCiFZaCLrRLrqmzt
eF.fbop18J868tgwOo822FLhppReR4OlEQN2S2rj4XNCMuzINNzc0gX4AZUQ
ZA+bxZcn69zjCk+tVrwcKmur7Gt0KXkiWRUayn94iG6rUdPHuHaFItH4uIbQ
x2dgyT4joS4yR4xIIuz4nbVXFMdFJCEGx6PawYvMRWbFbqWfx174mLGsy.nX
Z3RpuyJItGT3VZ3rnt0yg5iFn4jpCYGvDh+Yxg7dfSrK+7QuN5.S5DcnSDmh
zXAh.R.0NSxaAenJHeJfa4Si87Sh8u4FeyVua1Qi9pF8Qmcr.Tzd6lPmGVtK
XCUaefKS.g8Iz8weMgboAzdKOUlKEhpQLMiuU70Vxku0PrA8iY9z35vFpIrn
NTAlJjgMmohXzGVztYrHpUrHChWtSrNNJdVteL3zxViKwnn.DlvmLyhMt9wS
EazCAgdaxYhtMxwgyaPCchnRdEFmi7+d6s6YLG55Mg6HVpf7LrD+AZebBYXE
YODZYd8zH5swccrFyn.GHOryYueP3tFYl5VIL1VptQHu.Ql8fPPMiPzaEgvm
lLiVNgOIPDo+e9iWUKfzcqnaJxz1MKcwQYXm8Yie5F7hq0x1K9pqeYSvkn.Y
mdDKBCN3uQBFF4k+oIBikTOFQXaTpEqXrf06QDbpe4Uu.1O2sGcqy5mJ8HyF
ycKiyht1yc8cGq9jLwkh9XULz0zwqWB3h0iSOIiMk.n5l4RXwjpppI1nu3no
6qC1sqPIHKJJx97HFWtVHcyAA+dS7AHaEjDYRfbAQnQcRtJZl.itdl8Rfhtj
YxF+9FgKUT4J03ZqOAd0T5kSCqo43VWOJOxiDemgYZHXpEJ74kC9ggb9ljSz
504elqeYoCtHM8d2rGcgO2IbMOXq0wGBk76OZTXiZaA2+nP+CtEOjsKDdgri
64xJQ6SBhbQw51tHGAqKDTzIBU2V.odCSRwPjXBVdAquqp9yf8TeW+87ESke
b1Yva93IwFtrXzW7B.z3WHMV2lGsrTeg.UBc2D3ymHkIM7OuP.xDoQpRvj3q
36ruoaWZNosQiDmDvqbB4TtDUWn7XiBB7JOV9c5QuMNY78t99UQnwA66XzPd
lu5X7UArQ204iWLTzxC9xgWxXRXNNyrfV4K534kHOW4M7niuKy+DZrqjdf.4
iJ0i+UlOEAddkga4P22zPaX7+qoO3tI9qx1VnHCvfBZMOnzxGuyU+rCqRjzW
FSYNKxflJeC9ZUIJN5qAODUID3RnhVO2ny0TVMeHkzXVuQYJn17VWlqLN9Q7
kKSpaZKc7cVFFrJ6ey8aqzsWPqp4hZI3nrhUYDhL2fkgOWORm18dqCMrcG6X
klMo4rMoH5qrAmJXOuCtatgid9t1aYSzLKrQZXs2x6xUeFFjxyLn1aYZYCEZ
uhX3Ds2xf2MLxubL185SYxcqBNvh0jzFxFqHx1hoBp3O7ZlPPFJf5gsi5wch
58b4YcLIyBEClXLoEkSmQ4TZjgsaC4g.8g8fx3vsjo1v3DwWvprLKyz9.QvR
MU07vI2E1Km4rMzmthXOHv9GOzWhvba3Nhh3NxOfntRp9ZU1EpHFDegJ61uU
nMNwNQz3uqjUanUe3SbhXrztMjXOkFORS.0jZ7nZSOWlAyPMSqR+WvY+tCrA
3X8hIIo+kqPoLl9J0ugAB9anOVL3rwCeUq2YKiv52wOYndHnkxnrNXf.iC5n
PfY75v0lq0YEpqsBNxTVkVyzLmKzp.Zr3Jhc8yBX82y7gq5Wbm6FYIHRUkIy
xIK3Xc6B+vWLTj1FZHBLCG5fS.zo2NzoOuPmgpPmo7KNgyEN9CoxTAM8yE8I
fnabicaD8ZCMo.GYRjWsaWdcNANyABbspr7GqFA.QTolsjWf8AP6YMmWONGO
dMsuMHTiEDydOmmZD4nqRWR..xBzYeMA5icBzSBNDH37HHgWU5Xz0DneMA5W
Sf9yVBz2bXWqQYi5OpRjHaBr+HDtw1WRYGm2SE2vXgcVemFO6D+k1yticuoi
vV11YH9J530Z11iX9cd1QTgrY.UqaNIfhXHjMlaA13D4jPcifd8YUVtVVroh
pi5PpJDZfjc5I9EettlBDY+ofEgSSQF+GTZ0mPWF3zgWGUNJc.LpPEqtGlun
Fz0MvoHXaNu6EUgUkZDKhTO9p5kHcCsQ2XA.VXRJZiu1V.lWl5IyBq66hJMq
8FH6u7ywqVktsUj6iojDqCzQonQ9hovF7huzxiC2HTwxHviSF7hk8q6JofTA
ET0.QSFMl1prb9WgErpNESzEZ1zSSZNecregWuEnx4rOs1ESX5V45NULq8fI
exndknPoE93rnxQBj3TOYrUF2fm9YCRYTyLPmF.aybPoDHG0DolCYJ0KQHJk
r1uBQQRuZSi3080tq6qcW2W6ttu18Bces6xrzwssbpyVJwMtRf6ptoHPxRMx
P1lcyvxoV9jK6NdB3W0vXJRnd2SXwLxkuYQzleK5E88tQDtxuQb4F1nk2nXd
MRuvY98YanxKTur+zKBB2HqDNXDlB5CYFLBuOHvVE1HLZjei8Bi1i2aDoDRM
adMFuQkjTPiGma5ipmWn93IpfThHZLyxlPPKxlvwXJXo1TnYoyDc9UZDFwKq
ZCvTs4WZpwW5noWp0vKId.TIZmDvOucQNrwM3yh90X4mxO1op3Vyejs.rq46
Zl6Csry+W0oQcHTd.ZvO4AjtOx86mTTeTUObK3yT19cuN2IUbkifhicW+uE2
S6v0z1bKsfWNuNkSqQW3p39VYtyxjmctONBjkZgRbDmrBGM983hbPcxxjf7e
uDYyD936BAZ+R.2CO2tIEU14tx8sejmSeR+iCehTxe3ZaGVcQV3aZj42euax
fiOvdTXevEBPm0B.+XPqUAbaZqyqS7vUcj0RLPWmJYiMU04QS0DguRmlDaYe
7vZmtQ7MsadZZYXXJOdg.PKK4U5D.p59FTyaDjlHfgbObDZXhjNqYA.vJjhJ
4C1j.wIGpMXqji2FBj8jpbWsl+UTVxvyOvFQMbnHcR7ZGIeCbvmbSU3SUcaw
bTcosEMF+y8zjy1sOS24tRZTnwMqIvM1.cnobin1lnmrQTqirkrWDi7aroCR
i1O.9FcOr31cgfha7TmnYHE7BoTEC3rsiNHASaWqS7I8KZIpzGmGmi+Ap2aX
NXe0h+EhE+O4DGOhxMi6T6QMHR6WoaO34DtX7ej8435UNxIw2lpDgbxvH+Xu
ReOSnuSJM9Jc9bhNitRJNOHEWIDmIDB7UBw4AgPeZLAcUP6rf9hlJWLP4T3w
8AimlGq9U9wme9wqJENCHBWs7dFPDlDqt5WC46rf1dUO24.YfbkH77SDLFM8
b+TPn1Gcum1MUskFEoiZoBtwf2Ee3wMK8+1W9ogWnZcCHDH6KPajgdxgVKzp
5gGYYNCHuGDk0ejuTLD.mYZ2HV7tpTYBqykhH2WcQOq5Kxd5HIAqjnWAPPhg
Dp3EFGU9vXrZMr9GOsILXK0+KBTcaUeFZfYpaDOcScCa6jq.U6ofFp8LAvzT
IOZHIX4QVNjMEQSQsmEq3E20cy+WlwOYQD0zUp0sDstZ0pv0aOXl9lZD379e
.aaltzEEmLnxOhfmDa.4qOx5WMVEWUgR6WC8VfQ9CTe58NsyBqHBqIV3dYRl
.d32Ub29reN4jk6pUF4Ejb0TM+de91j3Y3r6CNt9ee.pAvY1upe0jMGidvkw
w28rrF2XkU37DO4Pmmyt+mHOW9ZTY394np59laHOku8dPIsnhptttI.28Q5l
gi2LZUKbco7oF.9j37+bcb3f0+L4yL91k72mPtxJ7KOGZs94fMznIDDOJ2NZ
xUqo.1UQsy7KQ+yG1spuYUS5ypaUd5li+h653ISo8TLe4a5zwmuN3740A6oC
wCm7U9mMl+SS5xqnd44.u+EmUCmAQY0DO+fW0y1XEYsZw+gFhUYVz37khms5
CfP0RRKAsRidNrv8u10653c98382btmx2.G+9vUy2MmdwsHomQb+Fmv6diOe
Cn+Mh7.zMXV6z4V0UCx0buqZdWzy2QrxxacVFrsDLOcpGxlfRxxBggr.IFEm
Hsqea0pgrBOsuxQ7buLG9VxFeD.LNcIT9yCMLNghZ+3GEqptto.ia5bU7M8h
WsR9xuq1EsmZ2+5JOJykI4Rk6cAA2M5Lit92cDIEggv.VY9agM.LKNJkGIUt
yAeC0LJpzaoFOedQ8DeebwqfiBS+.leypd14qXcFR7a1u6JJLkcsoAivcggG
SAn6jg1+a78VmyFOm83dSpRvVMJam4qCISMbJyWmojCXqmrlqsswVlIWYZME
t8TdWYrOekUuvkptkpdTampiHrWbaSUobfnxVk5HN+JuknNlzmQvkUk21SO1
s7zI2kiSeujph.jxD1RaioJmG3t15RGItN4oAw67NP6qgLaZWB.ZlzjG4Z+r
KpyaBlogAO3eDS0dUOO9S0+1SNGyLEfrIx9oxzDiQRwLDj4k0jMS+PHkdTSU
A9LK6vPCtqGS1r7WoaNh4nYlaevr.k.Ip0mnI5+l54E7v.2CoP2.H1.fU21J
QDYCQYZAsDd0xtIR1VQxjAJnW1fhnbWeHv6n3exRFEg4hBQ1NZEMCMlyz8GB
26crl+My1NwzgDlnnPwAWhjzmGn5.oSNHBvvVHqfrHFHqI.BCoqot22WAgah
eIKVlbnCTbe3YTmkxvS9PuaIMkswnaXHyPNSesYxNtCik1F2MAXVYwhn9ahN
a0zGEupPB9FwsAn4ZW0K1Ye+y9BSjO5DGn8Qdfmi9DIJld6AOuXEJ8V4DwqC
Y90I7pKyEbdZAp0Z2Ua4WKBAJy6uA1DK6gcClL5TTU6TvK9HKrnxgqOJnFjk
okAVt26lJCgQ1X6oPEaL+nC37bCA5Ad9dd5XpHIfi3EgNhSv2L8lIULRs1tV
w8wsJ3.rQ+alXPaafgUhAI8j14v..jKYi16j926FF+j1eeavIH5mreW2xgq9
e9Ow9J++.i5BEL.
-----------end_max5_patcher-----------
</code></pre>

The WASD navigation comes from the [jit.anim.drive @ui_listen 1]. But there are two “modes” for [jit.gl.camera]. If you always want to look at the center and just rotate around, then use @tripod 1 @locklook 1, but if you want to move around freely (though without a comfortable way to rotate) then @tripod 0 @locklook 0. Also consider using the @speed attribute for the [jit.anim.drive].


Hi again! Sadly, raycasting is not so intuitive in jitter (at least for me) - but I figured out something. Here is an updated “explore the dataset” example with better navigation, more precise query point, and prettier drawing (also with more sounds - added some percussive ones).
I also made a little example patch, @tremblap suggested that @rodrigo.constanzo might be interested. :slight_smile:

Hope that some of it could be useful for the awesome dataset plot abstraction of yours!
world-navigation-example.zip (20.2 KB)


woaaaaaaaaa this video is even more impressive than the demo you showed me! Thanks for sharing the vid as incentive, and the code as helper!
