Regression + Classification = Regressification?

Well, well, well. What’s a data-based post without a little completionism.

So I wanted to go back and test stuff just using melbands (like @weefuzzy had suggested ages ago for the JIT-regressor stuff).

So I used the same methodology as above (10 training hits, 1000 tries with testing data) but with some different stats. For all of these I'm using 40 melbands between 100Hz and 10kHz, with the only stats being mean and standard deviation.
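
For reference, here's a minimal Python sketch of the kind of feature vector I'm describing. This is an approximation using librosa rather than the actual FluCoMa objects, so the exact windowing/scaling won't match what I ran:

```python
import numpy as np
import librosa

def melband_features(path, n_mels=40, fmin=100, fmax=10000, derivatives=1):
    """Mean + std of melbands (and optionally their 1st derivative)."""
    y, sr = librosa.load(path, sr=None)
    S = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels,
                                       fmin=fmin, fmax=fmax)
    S = librosa.power_to_db(S)                   # work in dB
    frames = [S]
    if derivatives >= 1:
        frames.append(librosa.feature.delta(S))  # 1st derivative over time
    # mean and standard deviation per band -> flat feature vector
    # (40 bands x 2 stats = 80 dims, or 160 dims with the derivative)
    return np.concatenate([np.r_[f.mean(axis=1), f.std(axis=1)]
                           for f in frames])
```

With 10 training hits per class, matching is then just a nearest-neighbour lookup over these vectors.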

Here are the results:

40mel with 1 derivative: 70.9% / 73.2% / 76.1% / 73.3% = 73.38%
40mel with 0 derivatives: 68.9% / 67.0% / 66.8% = 67.57%

The results are pretty good, though not as good as the MFCC-based results.

Since I got pretty decent results from taking only the mean and standard deviation (rather than also taking min and max), I reran some of the earlier tests with 20MFCCs.

Again the results are decent, though not quite as good as with the more comprehensive stats.

Here are 20MFCCs with only mean and standard deviation as the stats:

20MFCCs with 1 derivative: 73.7% / 71.0% / 73.8% = 72.83%
20MFCCs with 0 derivatives: 71.7% / 71.3% / 73.0% = 72.00%

Where this starts getting interesting is that although the accuracy is lower, I'm getting pretty decent results with far fewer dimensions overall. For example, the last test there gives me 72% matching using only 38 dimensions. As a point of reference, the best result in my previous post was 76.2%, which took 152 dimensions to achieve.
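
For the dimension counting, assuming the 0th MFCC coefficient is dropped (leaving 19 coefficients, which is how I'd get to 38 rather than 40), the arithmetic works out like this:

```python
# Hypothetical dimension accounting, assuming the 0th MFCC coefficient
# is discarded so 19 coefficients remain.
n_coeffs = 19
# mean + std only, no derivatives:
dims_small = n_coeffs * 2       # 19 * 2 = 38
# mean/std/min/max over the coefficients plus their 1st derivative
# (the more comprehensive earlier tests):
dims_big = n_coeffs * 4 * 2     # 19 * 4 * 2 = 152
print(dims_small, dims_big)     # -> 38 152
```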

So it will be interesting to see how these shape up with some PCA applied, as it will be a balance between accuracy and speed, and the initial dimension count for taking only mean/std is already 25% of the overall size, before any compression.
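
A rough sketch of what that comparison could look like, assuming feature matrices X_train/X_test and labels y_train/y_test built from vectors like the ones above (the component counts here are arbitrary, and capped by the number of training hits):

```python
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Trade accuracy against dimension count: fit PCA on the training hits,
# then classify the test hits in the reduced space.
for n_components in (2, 4, 8):
    model = make_pipeline(PCA(n_components=n_components),
                          KNeighborsClassifier(n_neighbors=1))
    model.fit(X_train, y_train)        # X_train: (hits, 38) feature matrix
    acc = model.score(X_test, y_test)  # fraction of correct matches
    print(f"{n_components} dims -> {acc:.1%}")
```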