How ordered is an iterative sound?

Thanks to @tremblap and @a.harker for insight and guidance here. I wanted to try and measure how ordered an iterative sound is, and my intuition was to look at the spacing of slice indices. To make it more robust to overall duration it was suggested that I divide the indices by the mean and take the stddev of the deltas between them. It works fairly well on toy samples - I need to try it out on my own corpus now.
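For anyone who'd rather read it as code, here's a minimal Python sketch of the measure as I understand it (my reading: normalise the deltas between slice indices by their mean, then take the standard deviation - lower means more ordered):

```python
import statistics

def orderedness(slice_indices):
    """Std. dev. of mean-normalised deltas between slice indices.

    0 means perfectly even spacing; larger values mean less ordered.
    """
    deltas = [b - a for a, b in zip(slice_indices, slice_indices[1:])]
    mean = sum(deltas) / len(deltas)
    return statistics.pstdev(d / mean for d in deltas)

# Perfectly even clicks score 0; jittered ones score higher,
# independent of the overall time scale.
print(orderedness([0, 100, 200, 300, 400]))      # 0.0
print(orderedness([0, 1000, 2000, 3000, 4000]))  # 0.0 (same pattern, slower)
print(orderedness([0, 90, 210, 290, 400]))       # ~0.158
```

The mean-normalisation is what makes a 30-minute file with two evenly spaced impulses score the same as a 1-second file with 200 evenly spaced clicks.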

This patch is visually inspired by @tremblap, and apologies in advance as I’ve used some framelib to do the list processing.

<pre><code>
----------begin_max5_patcher----------
5330.3oc6c0sbiiiq95zOEpRMWLyt8jg+So8p9T041ySvrakRIVNQ6XK6xVN
alcqMO6GIQJaKGIa3XHY0cPM0jzQ9GR.B.R7ABf+yWt41GV7Zx5aC9aA+dvM
27e9xM2T8nxGbi+uu414wu93r30UusayR9WKd3ed6WcuTdxq4UOdYvjzWlmD
mU+JoSpddw68W475GlsYdZ1rj7puJ9tGtXS9gOcYb9iOml8z8qRdL2MA4QV1
cruFD59kQU9Sg3NVv+v+gbeO4+4xD2m31a29RUeeIq7Tlmzt41ooyRdIY05z
EY6M52ba7xk683a16iTxO9mKp9hB+51Gkl4dDe6iVk7RZ8mWt8owqJnp7BRZ
ypp43suZT2t6qYwjjUYaRq9lbO7+9k5oT0ZPV77j0Kiez8gKWppe4c7Ioojw
Xip9oVV9KsjuiMUrD+zrEO9GIUKQr5GtXYRVZ1xUIqSxxiy8S8su7jjowalk
e+zEY4qS+2US.dAyusWepeB15KVRBUy9+mUowy1N+eZU5jEYkShFKDkOtd3J
DAzkTS4O2QLUuir3ks7gKj9J3Jc7hqKHxMqeHdU45zCypFAQ8KluXwrluz1O
2rjo49WdYZV1Abw7EK69EWk9zyG4y9vhhWb9w9tqdk02uIy8p2WHRje+53WZ
xsyimMyqm17q+03rz4w4I4otk.Aa6KljEWPnOu9wUKlMqA85dkWZ4UlTHh+X
x+JcR9yUCz9BCEu8zk0BQ2tcUdR5SIqya9r73mV27Iqy+SGSeuGs4AuJ784I
yWNqfJpdC+uNIqf+u3WCr2tGotyr19Ju6adqwyOvLmyRxVMyCLns+yaylVK1
0X68JsYayVIYqBqTWkr5etULuxBvh4yKzMavXplZYSRdcOaFkVM9R8+3qePN
PCC8MM1Oc1cEV6Smj7VGLHYmLHQmLH9wYP5JVBWUwlzlCM82o4+AiiTt4WW7
CwGPf4D7iPaE+fUwVz5qA+nhJ5fh0cRwrOJEaD2oKzPXGQAo68++35O61nYV
ZVWFUpltkud6rs0K1r5w54jm8DzbpWXwLOMa61t+9VAmCdeKVMwcJFFnUx9Z
hI6dhw6kIl3LlXbfh4m6bPBbNvcuuVEl9x9Gp61xstmbu6XJ2GmmuJ8gM4Ng
r8Ok5YscZwY6dHdleyxsG1pkca+xtIW0Ocrn9xI.FRNAnipNXanwRNAPNAPN
APNAPNAPNAPNAPNA78lS.W9wI+w3X0iVeN5Iuz975DPDR9.vsVmO.JxG.xG.
xG.xG.xG.xG.xG.xG.JP.Tf.n.ALl8AHDGe.Bcl5BMbxE.xE.xE.xE.xE.xE
.xE.xE.JL.TX.nv.LlcAvhiK.J2wZC0gjK.jK.jK.jK.jK.jK.jK.jK.TT.n
n.PQAXL6BfAGW.LtDB1Yvi7.f7.f7.f7.f7.f7.f7.f7.fBB.EDfweP.1cDr
hidN6wEyVTrglWfs3nLwSRKkPa6cU8StarY2IsFkk+0x+UnRxUk+qhebvw7N
3SKp+zhHFOxT8oYQgLk6eU7H9o9Jtnug3rm7myxt2gE1+crb0hkKVs8vm2Ii
9gvkohCsuN9oj24yTAWPpB4xVbYRJZOEpEmecTR316jEIpbIP.xuI7bLrXyw
hi0F+5asRk3UsnjNpzcRAWThFVpbc9jhiD2NYJvlLYNmfY8EYdDQVAKTpZkJ
kHIxpcmyajJxpPq1FDNlEY0XSlWUQVqVzJUZPRjkKFyhrVzREuQsHaH1jYeK
xdDrTmNsURLBWRLxQaRBLUBLUBLUBL09.L0iCbVwL9sfjrIcgcVnFezDEpp7
HQXbl20iP3DSy1r9s.dmbEU+wUjNthc7wUlkj8T9ycByZHCefV8LEdTGG345
yTJzlmtpXef2BXAptXM1dPKh6Tbj7JVSzHTKZ8yoSyeK3W6TMR2erEerJbYr
13hsTd3x2B9s4oqmWN8CV+7pzr+nSdjpuBmizE3B96O840N9EhHF9QvvQzCd
DL9ZeDCTQTODETOCRZwNHn8RPbpjQfEk.sdnuJWmyjSM3WmqPvwWobeqdK7S
UmhB17ndF2SyCvKV0y39YdX0mkPSustnAOOZgy02ZV5yS3cX0rzfkni9LeQE
aGbMICWHuIv0Hv0Hv0Hv0Hv0Hv0Hv0Hv0Hv0Hv0Hv0Hv0Hv0Hv0Hv0Hv0Hv0
Hv097.tFx2zaBbMBbMBbMBbMBbMBbMBbMBbMBbMBbMBbMBbMBbMBbMBbMBbM
BbMBbsev.WK40kqB9oTdveMXUb1jEy+Y1e+uWklpL1uzJlaXU1ALtZHgzclP
9UHgXyVrZdqjnD26rWnym.kgfUjfUjfUjfUbvgUbYxpGJ0fd6rqsZeXnhhXk
0VMgyKFKa742+zUENOsq3VfQEY7DrDmcNIi2d073JyRJdeoy2LuPqnCNxGA4
CwIvST3KJFtc..h7wzYKhKbEuDzh7gi8jlcb1iDe1iSGpl6vGwbm7fYk+GhE
qRIDcImcEt.JPze0++CEa4kpiW+y+zTQvuF7SS4+RWpV8P4M0C4J2yiXsauo
iSGA.9sVNZz.yT4ULUwuD7aE+RF7s0OFOKdU4YR6LTQptMqKuLy5BVj6W1uq
3yGC2Wt.eXe86AVcHeW4sRn+dE1WtBeTe87GSzQXPe+01I5yhNqBZkusT2OP
zWyB9XfU.kSLFZpEJ4PCV4YInvGiMiDUeW8lEPlD87bfCu5M2KyAt37pTy8y
j.p4D0O5.W2UI3zwxBVmLOM4kjrYKxd5qAEaAl9Xx5FO8tzIu90fGhydpUvd
UXUfYcf8pCcaZ+9a.R+fm8Kw636SmsIcxcOrYpXV557eW9OZoXWt+6H3aE+q
oIqdG6pUFExEuUanCTHtZvK3mddP77kqmUHt7Vv2JNi8yqRV+7hYSJNddweO
c5tGvB9VAAW8Vc24p.cwip.TqfzB1lg+kp2OtXi6jY51XgmSSKRdZVnwTWFt
qXnliwBcx+6AO.p7TmKY+rjE7WBTJNiU4TVZguY+kxnK0p7jEIEu5.I4tiDb
irujmJLWluHqUZYaUhcYb4sOKOY08NXn26H+XUzi8VVT0+7Dq4XtLmG7Pviy
RhW0JS3bpirhSSnBkGcG.FHpktcSNbI4zfVkeULjH2Z4W2kpTDUga9Qo1Rj8
Z.vGNj5l+cZvtRzcCZkeF5p.LbIbaRp3vWYq+M1D8xjj+3sFa.1JCn8ndK+n
FqT9lODO5zb.mEaLoY2t9Mo5.YYH+ak1kHInaUgt3fqAGw+s6VgtortC+uRg
aVEQg+mB+OE9eJ7+T3+ov+Sg+mB+OE9eJ7+T3+ov+Sg+mB+OE9eJ7+T3+ov+
Sg+mB+OE9ebfuUia6Ljfukfukfukfukfukfukfukfukfukfukfukfukfukfu
kfukfukfukfukfukfukfu8Cj8Vaxdetas6YGOysTFjRfD+8zUG57f0x6qDH4
DLi2yJ.xHvJSZ7HdWmBapqeJro.mBaMkZZkMEh685tNA1duuY+vl.apHjRfM
OK7pm.aGQzSCVz6TBdZFtQjxK3wCi9zH3o4HI34YgeWm4jZrRYYA6pm4jaq4
78YlS5WzGsYNo1fTFV4IzQdlSpsHQt0xui3LmTGhTlSVuzN1xbxik0j5HjxZ
R+BsOqIiTWyjl7zILoAqLCVxz6mrnR8fjvjcar1vG.i09iFNZMVaDHs55IzZ
i0rnQowZiDWx85art5DW6Wry2+vWsVyyMHkQvEtJ5ZRP9f60eG35r2mxnQZe
pZo5w19T6vGnUx2fzNUdxu1j86im3PtU0NZ9HaVgzgwDbYiJafJ7Zlc+sUgC
JcOucabbr3ARmYsHWKMPKFCrfl.i0ACHDWFf26ZEaLv..P9Q3R9d+oE1Ag76
BU4hIgjwYhVoXINPHD5.+r3vKh1uVb8NhT2kuXu6AXShDossC8clMlti1dX+
SjqymLI4k1oRMxToyoi9qerbD4UgYuxLWChjgi7pxoSNRkWQpcVq73CORkWE
HSkWQ4UdjY2YG1mHYbbjWGwRqLjVGMiYgUFCWhrukU6LaoJ6xosPfhHElDXj
6WREkqTTtRQ4JEkqTCdtRULiK73KaRmcEcc+0p3c00PyHr0eWPkaV+Vm2q9p
FoZewUbQsvXGebE2UR3sNYJL7ShLOSgG0wgct9LkBs4okw14s.VfpKVis+5T
7RW5VGMB0hV+b5z72B90NUiz8GawmOPF93isTdzx2B9s4oqmWN8CV+7pzr+n
SdTOXpw45fzCfgTcM3QGKUbp5+3XmKN99CO6HsG9uaxEmp9uL1IiimAIsGgC
MdRFmyoG0qG7178YL4TCda9N775A48ViZODb6FudF2SyCvKVg8Z1fX0mkPSu
st.uG02Bmqu0rzmmv6vpYoAKQGw+zVHh5DZMIp.cSXqQXqQXqQXqQXqQXqQX
qQXqQXqQXqQXqQXqQXqQXqQXqQXqQXqQXq84AaMApW5YBaMBaMBaMBaMBaMB
aMBaMBaMBaMBaMBaMBaMBaMBaMBaMBaMBaMBasevvVKOnsxAi3rJEFGqTGoT
tx3EWrMqkgTum9PY8Z9hmdZVRajif0+UrLgbaSZgAsjkgcMNZR7iEtIGHZkG
bN4hNC.c5qFe58b0AQLemE+m0EslVoFAV0pk5UM20qLDPNYuN8orR7E6kRNW
6piHUbOD95wTg5n9JpNxMCf5HmMpUG4VjTGqoyAPcrKkQzpaTaWybpgF6nTY
Dqx.njEd82ajuE7z9TYzCi5HUYjGJPRYrlNG.kw5ZZWqzCGIExsqa9jOPbUU
HOrb90YobiGxvxhTXULO8kxOQz0rR1UUwR+oTdvesQcKsrhk1V8JkaQ5P7ZW
0cSZ70yP0fW1Z5nIeyMLTysFpGeSw+mh+OE+epGeS83apGeS83apGeS83apG
ee4LUpGeS83apGeS83apGeS83apGeS836and78mud7cMn86Zq06Ciems0ZtA
oZnuGi25tZslc06p0BvsV384TsxirnhCdcmENh+YoyByMg3zYg8bvwViENLT
vXmpiBW3AJNpZ0ALx2.jjrAugBysCPOprt0iLRaQkbKR2QJOcNtamvbKRcnx
Zg2wa2DlaU3zkFqWXGaMowic0FrZbZQi9k45dIr4Z1fFqo2fustbiufJq0sR
7H0gvM9lznO4rBupMowtCuuLD0z6mBuOEdeJ79T38ov6Sg2mBuOEdeJ79T38
ov6Sg2mBuOEdeJ79T38ov6Sg2mBuOEdeJ793fdqEydJNAdKAdKAdKAdKAdKA
dKAdKAdKAdKAdKAdKAdKAdKAdKAdKAdKAdKAdKAdKAdKAd6GH2rb0Y0cYlk6
uOddYIQJur7WNWcn2wUSekrHmfGzjC.f9QJYYL6mVZ7nnqdZowAmVZ6DSZiA
ILnd6sqSJMazmljRSXwIoz7bvwSRoUm3C6DfZOmdDbbKiidF.2dMy2gZh+3j
NKDWR2mKdL0Ufz61XCTKMGyNCSgYbFqMyXLeZLyv33XlwbsR80NSCznAnt+W
StCQRftY9CIskpm1PjnyiXHofBcAfxEHdP4OeuUEbgkPycjtbm+QTu1oyrIZ
vjiGoIyrEoB5r46fTY1hzodBG6IxbGors3iswynKMl6HEl6H6skeLqR9DX1x
ulIvLjjW1pwQp12QI8zM2HGjCz1so4An8I38earZa1fTclvSl0FmYQiQiy5H
TI1qu0YWqDnS0VcHpv7IcATOzzi85jycaHsEmsgpkeGa6C0YWzRaPZmH89lj
CUWyshbD6o1LRqvQKNR0XyHq4h1Lp5C0LzxNNxgA9vyWdev4jMB6RWw8Qw1E
ui2w8gOVVHikTfvXY.LRhHEBiDmChn3Xv.UJ.iEGighy.QU3HVHgPUXLTQPH
JQDJKUZ.iEFiTHHIcLzopptNmbrBQXjrfnIIFzjAhftEEtGDpxfgrmFhYBo1
hxXEt+8WnywBCKs1PPjkZ+KyvtqPh7B0m4PnSocvnypgpE5jeYzoAjriACcO
oQCZrvP1IBhZto49CaYo5AXnUsOzx9enkFd6iMa.F6lcC7cisX.F6C1NCIMH
PicU0FsswV0+VNJiwQaFNBGfgV19PG0+CstCyk5AvTckcNN1JX.2NLr8wd.D
zjMuQmXobCZrqp3osM11gXrYsO1lKarg3SqECm+pTSEPLifxHwOE+T1zj57z
IKWjlk6g7HrJcsjNveMQa+iKcpAwudKJmqkCfITKOeYiDCxHowflXP2N3hQ5
AhS8VLPO.zQiMnfdk.5NZnLRmRh.Eu2zPVmzFrFINjQ5R0mzfN4AFn7ng.df
NDqQ5zlj4n4.H+jGi.EPQLRPGUCuwhCZr3W7XI.MVn.1CH6Racq+BGKP3pih
41COjbWxg1icLDNOJbuChv4gVrNJhDj1uDECMRP5+xZarW3XY.I6hRTFzFvt
Kc4iEHfsz3.1KH8DMN7PP3pnPQ+WAJXMJTVuTvbmUgxXASWFm3EBRlWgS.uf
3EiTpNpMTCugMTCCOanJXxtnrehhAZ+DAF1PkQfFKT.rWFAF1mKerfEEDTjc
kfzSjnXWSBZuAINqWfBliDk8gjf1GRfh9kDjtr.k0KArKM.J6CI.IyKP4r7B
P6CIPwGEAHYdAJ6uJ.IyKjnPWf7GRfh8PAL8KFFzEGz5EGk0KNn0KNJqWbPq
WnbApjvtWXbTVuXfnKFGkwBjsWFF142dA8NwXwPYr.gODGG5BlbHF6eI.YiR
H38QrvOmAG83i5F7ScxXAOBEtLCDghhzCOBzXczflITZQUUNj6qOxReFib3e
y3tKmszF4pJhb0cWNE.5RKxs8hL4YL33KSxCAIShhlOGzkflGhxXoAMVliJS
x0VuLoKQd3gbuLYy+tVlrLgevSlDzcqN7n3WHjbaCspBsvFZU0+cMEHTJznf
CmacQAnb8vC4fjiUnMVmltD8g0hyYvQ2ZA2FA55lfxEgt5BFH.c0VvYrNYLF
OHBPGnqoY6euOTgncwO1xOOAmPiBm.zEfwpQ3j6Uc2a.ikAE5R.9RZc4iEHU
TKJo.gE1s6LBkTS.zkMCkH6t8a4DiEJZ+fhr6Vp+BS5BPik.mD7.jrACk8eA
gdWUAUCgwBx4H2lbPWXZmA5LLnjlEaWINw506sQ4x5xC58KkCyA87kC52Kuu
Wuzced4vd7RU4jrqd6xW9ue4+GPLFFH.
-----------end_max5_patcher-----------
</code></pre>

Is fl.stddev~ a new object? I’m getting a “newobj: fl.stddev~: No such object” error message, though other fl. objects~ appear to load just fine.

It might be worth pinging @tedmoore on this as from some discussion on Slack he was taking zero crossings as a way to determine something not too dissimilar to this.

I’m on one of the later builds where I think the names have changed. It should be fl.standarddeviation~ in old speak.

I am officially pinging @tedmoore to reveal his secrets.

I don’t think I have insight here. I was using zero crossings as an analysis parameter and putting that into my vector as a crude (or different) estimation of pitch. I found it useful in classifying no-input mixer sounds, which can be quite noisy, and that noisiness can upset pitch detection algorithms.
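Not @tedmoore’s actual code - just a hedged sketch of what “zero crossings as a crude pitch estimate” might look like: count how often adjacent samples in a frame change sign.

```python
def zero_crossing_rate(frame):
    """Fraction of adjacent sample pairs that change sign (0..1)."""
    crossings = sum(
        1 for a, b in zip(frame, frame[1:])
        if (a >= 0) != (b >= 0)
    )
    return crossings / (len(frame) - 1)

# Fast alternation (high "pitch"/noisiness) crosses on every pair;
# a constant or slow waveform crosses rarely.
print(zero_crossing_rate([(-1) ** n for n in range(100)]))  # 1.0
print(zero_crossing_rate([1.0, 1.0, 1.0, 1.0]))             # 0.0
```

Unlike a pitch tracker, this never "fails" on noisy material - it just returns a higher rate - which is presumably why it holds up on no-input mixer sounds.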

What do you mean by

?

I thought that given the kinds of sounds @jamesbradbury is (planning on?) using here (electromagnetic pickup recordings) that it might be useful to think about things in a similar way to the no-input analysis. I guess in your (@tedmoore) sense, it’s more about delineating pitch rather than (slower, rhythmic) periodicity.

That being said, there may be something to that.


That’s true, I was using it for real-time analysis so I wasn’t so concerned with repetition through time, as I was looking at the most recent analysis frame and doing classification.

However I see your point about the sounds being similar, and therefore zero crossings may be interesting for @jamesbradbury to investigate.

Another boring-ish option would be to investigate beat-tracking algorithms. As that’s kind of what you’re getting after, though for the opposite reasons.

So something that beat-matches the clicks - however much it has to correct in order to do so would literally be the number that you’re after.


I got good results implementing the patch above in Python and playing with my whole dataset. What I really want to use it for is to concatenate samples and see how stable+iterative the result is. Given some constraints like

  1. Minimum duration
  2. Maximum duration
  3. Min Repeats (if repeat)
  4. Max Repeats (if repeat)
  5. Minimum diversity between repeats

and then sorting generations that satisfy these constraints, I think that I could start getting into some cool ways of navigating simple transient/clicky material :slight_smile: The next level after that is pulling apart the sounds and rearranging them even more, then testing again, so you can see how getting this part right is fundamental to the overall thing working.
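The constraint filtering could be sketched like this - the candidate fields and thresholds here are hypothetical stand-ins, not ftis’s actual API:

```python
# Hypothetical sketch of filtering concatenated generations by the
# constraints listed above. "duration", "repeats" and "diversity" are
# invented field names for illustration only.
def satisfies(candidate, min_dur, max_dur, min_reps, max_reps, min_diversity):
    if not (min_dur <= candidate["duration"] <= max_dur):
        return False
    if candidate["repeats"] > 0:  # repeat constraints only apply if it repeats
        if not (min_reps <= candidate["repeats"] <= max_reps):
            return False
        if candidate["diversity"] < min_diversity:
            return False
    return True

candidates = [
    {"duration": 1.2, "repeats": 4, "diversity": 0.3},
    {"duration": 9.0, "repeats": 2, "diversity": 0.9},
]
kept = [c for c in candidates if satisfies(c, 0.5, 5.0, 2, 8, 0.2)]
# survivors would then be sorted (e.g. by the orderedness score)
# before auditioning
print(len(kept))  # 1
```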

It’s probably too late for this part of the process, but you probably could have done a lot to mitigate some of this analysis-based stuff with metadata/tagging of the recordings as you made them. Tagging things with “dynamic” or “static” in terms of the kind of material you were gathering. So like having the mic on the laptop while it was just sitting there, vs putting it on your phone while you swipe through Tinder.

There probably isn’t a concrete relationship between the behavior and sonic results, but it’d be a vector that probably carries some meaning nonetheless.

How equally spaced onsets are regardless of duration.

So a file that is 30 minutes long with 2 impulses 15 minutes apart would be more stable than something that is 1 second long with 200 impulses all in the first 200 samples.

The only problem is I have 4000 segments to deal with (no way am I going to audition them) and I didn’t know what the outcome of the slicing would be till I’d sliced it, so metadata tagging is not really viable. In part some of the samples stayed true to their original form, but extracting certain parts changes the morphology of a segment quite drastically - from something that varies to something that is very stable, for example.

I think that’s a bit of a misunderstanding of the idea - “If you’d labelled these manually you wouldn’t have to search automatically” is sort of turning the idea of “Finding things in stuff” on its head. Interesting things might happen (or not) when the computer listens, rather than a person.


That’s true, though I guess I meant to add to the pool of stuff that the “computer listening” can draw from. So in that example (active vs “inactive” computer behavior) there might be a surface to find a non-obvious correlation via algorithm. It may (or may not) have some perceptual relevance (i.e. busier when “active”), but there might be something else going on too.

@weefuzzy is the expert on computers listening wrong, but I think there are lots of other categories: “computers listening faster”, “computers listening differently”, “computers listening in a useful manner”. One goal is to model your way of categorising, and in one scenario the question is “does the computer do the same as me?”, but that’s only one (meta) category of useful/interesting “listening” that a computer might perform.


Yeah I can see that.

I was thinking, after my last post, about what the “domain” or restrictions to the listening apparatus are. (specifically in the case of @jamesbradbury’s ftis stuff).

Like, is it temporal and experiential? (taking a single sample at a time)
Does it understand context and/or "meta"data (e.g. something as simple as the duration of a file, which isn’t really an experiential thing, as such)?
Does it understand metadata (tags, etc…)?

Can easily get quite philosophical, to be clear, in an interesting way.

Yes - in general machine listening tends to be less contextual and contingent, and more like a production listening that is obsessively focused, and sometimes clearly diverges from human perception, but that’s partly because I don’t think we have very good models for “listening” as a general task.


I would add that I think we can creatively (ab)use these algorithmic limits by reflecting on the contextual and contingent disappointment we get in our description process. I often learn a lot more about what I am looking for by being confronted with the rigorous answers provided, however wrong they are…