Kinetic energy, also called informational energy, of a random vector (of features) is basically the probabilistic analogue of kinetic energy from physics. Some people say it is an entropy, just like Shannon entropy, which measures bits of information to quantify uncertainty. It is indeed an entropy-like measure, but the better way to think about it is as the 1/2·m·v² of a random vector.
It was discovered by Octav Onicescu and is defined as the simple sum of squared probabilities. For a trivial example, if X is a random vector with corresponding probabilities p_1, ..., p_N, the kinetic energy (also called informational energy) is equal to E(X) = sum(p_i^2).
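The definition above is a one-liner in code. A minimal sketch (the function name `informational_energy` is my own choice, not from the original):

```python
from typing import Sequence

def informational_energy(probs: Sequence[float]) -> float:
    """Onicescu's informational energy: the sum of squared probabilities."""
    return sum(p * p for p in probs)

# A fair coin has energy 0.5^2 + 0.5^2 = 0.5
print(informational_energy([0.5, 0.5]))  # -> 0.5
```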
Similar to physics, except here there is no (1/2)·mass factor, probably because these probabilities are floating things: either their mass sums to 1, which is not very plausible, or their mass is 0 if you think of them as floating inertial bodies with no mass.
Probably quantum mechanics explains this better; the explanation should be updated. For example, if a random variable X = (1, 1, 1, 3, 5, 3), the kinetic energy is computed as follows:
There are 3 categories in the random vector, {1, 3, 5}, and the probabilities of each category are p(1) = 3/6 = 1/2, p(3) = 2/6 = 1/3, and p(5) = 1/6. The kinetic energy is therefore (1/2)^2 + (1/3)^2 + (1/6)^2 = 14/36 ≈ 0.389.
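The worked example can be reproduced directly from the raw samples, using `collections.Counter` to get the category frequencies:

```python
from collections import Counter

X = [1, 1, 1, 3, 5, 3]
counts = Counter(X)                              # {1: 3, 3: 2, 5: 1}
n = len(X)
probs = {cat: c / n for cat, c in counts.items()}  # p(1)=3/6, p(3)=2/6, p(5)=1/6
energy = sum(p ** 2 for p in probs.values())
print(energy)  # (3/6)^2 + (2/6)^2 + (1/6)^2 = 14/36 ≈ 0.389
```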
Notice also that sum(p_i^2) <= (sum(p_i))^2 = 1, with equality when a single category has probability 1, and that sum(p_i^2) >= 1/N > 0 for N categories, with the minimum reached by the uniform distribution. This means that the kinetic energy is bounded in (0, 1].
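The two extremes of the bound are easy to check numerically; a quick sketch:

```python
def informational_energy(probs):
    # Sum of squared probabilities (Onicescu's informational energy).
    return sum(p * p for p in probs)

N = 4
uniform = [1 / N] * N            # maximum uncertainty -> minimum energy 1/N
certain = [1.0, 0.0, 0.0, 0.0]   # no uncertainty -> maximum energy 1
print(informational_energy(uniform))  # -> 0.25
print(informational_energy(certain))  # -> 1.0
```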
Having a 'new thing' that measures properties of random vectors, I will try to build an improved kNN for image recognition.