I trained an MLP classifier that helps me identify different gestures when hitting a bell. It works quite well, even if I haven't yet learnt how to fit it properly (using validation).
I'm wondering now how to identify when a gesture (here, a way of hitting the bell) should be ignored because it doesn't belong to any class I created.
I searched the forum and found this topic where the use of a KDTree is suggested:
But I have difficulty understanding how to implement it, and the general idea behind it.
My question is: how can one prevent a gesture from being assigned to a class when it should belong to no class at all?
As @tremblap said, there isn’t really a way of training a neural network to identify a “negative case”. A neural network learns to positively identify members of classes by finding “boundary lines” between the classes, so to have it learn that something is “not class A or class B” you would need to provide a bunch of examples that it can positively identify as “not class A or class B”, at which point a better name for that would be “class C”. This could be a useful strategy if, say, your class “C” were “silence”: the network could learn to positively identify silence (although we have fluid.loudness~ which does that better!).
What you could do instead (sorry if I’m just paraphrasing the other post) is have a bunch of examples of class A and class B in a dataset and fit that dataset to a KDTree. Then with your new data point (the one that needs classifying) use the KDTree to find which point in the dataset it is closest to. If it is closest to a point that is class A, it’s probably class A; if it’s closest to a point that is class B, it’s probably class B (commonly you might use the k nearest points, more than one, to make the decision).
UNLESS the data point that needs classifying is “really” “far” away from that closest point.
Maybe the distance to the nearest point is 0.1, and maybe that means it’s close and “similar” to that point in the dataset, so it is likely the same class as that point.
But maybe its distance to the closest point in the dataset is 10.1, and that means it’s actually so far away from the data in the dataset that it isn’t similar to class A or B. The actual distances of what counts as “close” and “far” will depend on your data.
You can get distances from the KDTree using knearestdist.
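Here’s a rough sketch of the idea in Python, using scikit-learn’s KDTree rather than the FluCoMa object (the feature values and the threshold are just placeholders; the same logic should translate to fluid.kdtree~ and its knearestdist query, with a threshold you tune by looking at typical distances in your own data):

```python
import numpy as np
from sklearn.neighbors import KDTree

# Training examples: one feature vector per bell hit, with a label per row.
# (Toy 2-D features here; yours would be whatever descriptors the MLP was trained on.)
points = np.array([
    [0.10, 0.20],   # class A examples
    [0.12, 0.22],
    [0.80, 0.85],   # class B examples
    [0.82, 0.90],
])
labels = ["A", "A", "B", "B"]

tree = KDTree(points)

# Distance beyond which we say "this hit doesn't belong to any class".
# This value is arbitrary here -- tune it against typical
# nearest-neighbour distances for your own data.
MAX_DISTANCE = 0.3

def classify(new_point):
    # Query the nearest neighbour and its distance.
    dist, idx = tree.query(np.array([new_point]), k=1)
    nearest_dist = dist[0][0]
    nearest_label = labels[idx[0][0]]
    if nearest_dist > MAX_DISTANCE:
        return None          # too far from everything: no class
    return nearest_label

print(classify([0.11, 0.21]))  # -> "A"  (close to the class A cluster)
print(classify([0.50, 0.10]))  # -> None (far from both clusters)
```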