Kinetic energy, also called informational energy, of a random vector (of features) is basically the probabilistic analogue of kinetic energy from physics. Some people say it is an entropy, just like Shannon entropy, which measures uncertainty in bits of information. It is indeed an entropy-like measure, but the better way to think about it is as the 1/2·m·v² of a random vector.

It was introduced by Octav Onicescu and is defined as a simple sum of squared probabilities. For a trivial example, if X is a random vector whose categories have probabilities p1, ..., pN, the kinetic energy (also called informational energy) is equal to sum(pi²).
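The definition is easy to compute directly. Here is a minimal Python sketch (the function name `informational_energy` is my own choice, not from the article):

```python
from collections import Counter

def informational_energy(xs):
    """Onicescu's informational (kinetic) energy: the sum of squared
    category probabilities of the values in xs."""
    n = len(xs)
    return sum((count / n) ** 2 for count in Counter(xs).values())
```

The function works on any sequence of hashable values, since `Counter` only needs to tally how often each category appears.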

Similar to physics, only that here there is no (1/2)·mass factor, because these probabilities are floating things: either their mass sums to 1, which is not very plausible, or their mass is 0, if you think of them as floating inertial bodies with no mass.

Probably quantum mechanics explains this better; the explanation remains to be refined. For example, if a random vector X = (1, 1, 1, 3, 5, 3), the kinetic energy is computed as follows.

There are 3 categories in the random vector (1, 3, 5), and the probabilities of each category are:

- Prob(1) = 3/cardinality(X) = 3/6 = 1/2
- Prob(3) = 2/6 = 1/3
- Prob(5) = 1/6

KineticEnergy(X) = "sum of squared probabilities" = Prob(1)² + Prob(3)² + Prob(5)² = (1/2)² + (1/3)² + (1/6)² = 14/36 ≈ 0.3889.
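The arithmetic above can be verified exactly with rational numbers; a quick sketch using Python's standard `fractions` module:

```python
from fractions import Fraction

# Squared probabilities of the three categories 1, 3, 5.
energy = Fraction(1, 2)**2 + Fraction(1, 3)**2 + Fraction(1, 6)**2
print(energy, float(energy))  # 7/18 (i.e. 14/36), about 0.3889
```

Working in exact fractions avoids any doubt about floating-point rounding in the decimal value.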

Notice also :

- If X = (1, 1, 1, 1, ..., 1), then KineticEnergy(X) = Prob(1)² = 1. Meaning that when there is no diversity and everything is perfectly certain, the kinetic energy is at its maximum value of 1. Notice something really strange in this case. If you make an analogy with atomic nuclei, this example with maximum kinetic energy is similar to the one in which atomic nuclei come very close to each other, releasing large amounts of energy as in our toy example, a phenomenon called nuclear fusion.
- If X = (1, 2, 3, 4, 5, ..., n) with all n values distinct, then KineticEnergy(X) = n·(1/n)² = 1/n, which approaches 0 as n grows. Meaning that at maximum diversity and highest uncertainty, the kinetic energy is at its lowest value, near zero. In this case, continuing the analogy from the previous point, the categories of the random vector could be interpreted as atomic nuclei resulting from the expansion of the previous high-energy ones, yielding a large number of low-energy atoms in the end, which we could think of as nuclear fission.

This means that the kinetic energy is bounded in (0, 1]: it attains 1 only for a constant vector, and it never reaches 0, since n equiprobable categories give exactly 1/n.

Having a 'new thing' that measures properties of random vectors, I will try to build an improved kNN for image recognition.

© 2019 Data Science Central ®
