Researching and introducing kinetic energies

Kinetic energy, also called information energy, of a random vector (of features) is basically the analogue in probability of kinetic energy from physics. Some people say it is an entropy, just like Shannon entropy measures bits of information to quantify uncertainty. It is indeed an entropy-like measure, but the better way to think about it is as the 1/2·m·v² of a random vector.

It was discovered by Octav Onicescu and is defined as the simple sum of squared probabilities. For a trivial example, if X is a random vector whose categories have probabilities p1, …, pN, the kinetic energy (also called information energy) is equal to sum(pi²).
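
The definition above can be sketched in a few lines of Python. The function name `informational_energy` is my own choice, not from any library; it takes a probability vector that already sums to 1:

```python
def informational_energy(probs):
    """Onicescu informational (kinetic) energy: sum of squared probabilities."""
    # sanity check: the inputs must form a probability distribution
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * p for p in probs)

# a distribution with probabilities 1/2, 1/3, 1/6
e = informational_energy([1/2, 1/3, 1/6])
# 1/4 + 1/9 + 1/36 = 14/36 = 7/18
```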

Similar to physics, only here there is no (1/2)·mass factor, probably because these probabilities are floating things: either their mass sums to 1, which is not very plausible, or their mass is 0 if you think of them as floating inertial bodies with no mass.

Probably quantum mechanics explains this better; the explanation must be updated. For example, if a random vector X = 1, 1, 1, 3, 5, 3, the kinetic energy is computed as follows:

There are 3 categories in the random vector, 1, 3, 5, and the probabilities of each category are:

  • Prob(1) = 3/cardinality(X) = 3/6 = 1/2
  • Prob(3) = 2/6 = 1/3
  • Prob(5) = 1/6


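The worked example above can be reproduced directly from the raw sample, deriving the category probabilities from counts (a minimal sketch using the standard library `collections.Counter`):

```python
from collections import Counter

X = [1, 1, 1, 3, 5, 3]
n = len(X)

# probability of each category = count / cardinality(X)
probs = {k: c / n for k, c in Counter(X).items()}
# probs -> {1: 1/2, 3: 1/3, 5: 1/6}

# kinetic (informational) energy = sum of squared probabilities
energy = sum(p ** 2 for p in probs.values())
# 1/4 + 1/9 + 1/36 = 14/36 = 7/18 ≈ 0.389
```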
Notice also :

  • If X = 1, 1, 1, 1, 1, 1, …, 1, 1, 1, 1 then KineticEnergy(X) = Sum(prob(1)²) = 1. Meaning that where there is no diversity and everything is perfectly certain, the kinetic energy is at its maximum value of 1. Notice something really strange in this case: if you make an analogy with atomic nuclei, this example with maximum kinetic energy is similar to the one in which the atomic nuclei come very close to each other, releasing large amounts of energy as in our toy example, a phenomenon called nuclear fusion.
  • If X = 1, 2, 3, 4, 5, …, n then KineticEnergy(X) = n·(1/n)² = 1/n, which approaches 0 as n grows. Meaning that at maximum diversity and highest uncertainty the kinetic energy is at a very low value, near zero. Notice that in this case, probably resulting from the previous one, the categories of the random vector could be interpreted as atomic nuclei resulting from the expansion of the previous high-energy ones, yielding a large number of low-energy atoms in the end; we could think of that as nuclear fission.

This means that the kinetic energy is bounded in the interval (0, 1].
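
Both extremes can be checked numerically; a quick sketch (the helper `energy` is my own name, computing the sum of squared category probabilities from a raw sample):

```python
from collections import Counter

def energy(xs):
    """Sum of squared category probabilities of a sample."""
    n = len(xs)
    return sum((c / n) ** 2 for c in Counter(xs).values())

# one single category, full certainty -> maximum energy 1
assert energy([1] * 10) == 1.0

# n distinct categories (uniform) -> minimum energy 1/n, tending to 0
assert abs(energy(list(range(100))) - 1/100) < 1e-12
```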

Having this ‘new thing’ that measures properties of random vectors, I will try to make an improved kNN for image recognition.