Entropy is one of the most important concepts in many fields, including physics, mathematics, and information theory.
Entropy is related to the number of states a stochastic system can take and to how the system evolves over time, in such a way that uncertainty is maximized.
This happens in two ways: first, a system will choose the configuration with the highest entropy among all those available; second, if we let the system evolve, after…
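As a concrete illustration of the first point, the uniform distribution maximizes Shannon entropy among all distributions over the same set of states. A minimal sketch in Python (the two distributions below are hypothetical examples):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# With 4 equally likely states, entropy reaches its maximum log2(4) = 2 bits;
# any skewed distribution over the same 4 states has strictly lower entropy.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]
print(shannon_entropy(uniform))  # 2.0
print(shannon_entropy(skewed))
```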
Added by Pablo Gutierrez on May 21, 2020 at 6:53am
In 1927, W. O. Kermack and A. G. McKendrick described the first mathematical model for infectious diseases using a set of differential equations. This model is called SIR because of the three states an individual can be in.
These states are: Susceptible (S), Infected (I), and Recovered (R).
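The three SIR compartments (Susceptible, Infected, Recovered) evolve according to Kermack and McKendrick's differential equations: susceptibles become infected at a rate proportional to contacts between S and I, and infected individuals recover at a constant rate. A minimal sketch of a numerical simulation in Python using a simple Euler scheme (the parameter values are hypothetical):

```python
def simulate_sir(s0, i0, r0, beta, gamma, days, dt=0.1):
    """Integrate dS/dt = -beta*S*I/N, dI/dt = beta*S*I/N - gamma*I,
    dR/dt = gamma*I with a forward Euler scheme."""
    n = s0 + i0 + r0  # total population, constant over time
    s, i, r = float(s0), float(i0), float(r0)
    for _ in range(int(days / dt)):
        new_infections = beta * s * i / n * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
    return s, i, r

# Hypothetical outbreak: beta is the contact/transmission rate,
# gamma the recovery rate (so 1/gamma is the mean infectious period).
s, i, r = simulate_sir(s0=990, i0=10, r0=0, beta=0.3, gamma=0.1, days=160)
```

Note that the three equations sum to zero, so the total population S + I + R stays constant throughout the simulation.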
Whenever we visit a client and present a proposal, we start wondering whether it will be accepted or rejected. Usually, the customer will analyze our proposal, compare it with competitors' offers, and make a decision.
To build our commercial forecast system, we need to assign a probability to every proposal we have presented, along with a numerical value for each of them.
One way of doing this is to multiply the value of the proposal by the probability of…
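The idea described above, weighting each proposal's value by its probability and summing across the pipeline, is an expected-value calculation. A minimal sketch in Python (the pipeline figures and win probabilities below are hypothetical):

```python
def expected_pipeline_value(proposals):
    """Expected revenue of a sales pipeline: sum over proposals of
    (proposal value) * (estimated probability of winning it)."""
    return sum(value * prob for value, prob in proposals)

# Hypothetical open pipeline: (proposal value, estimated win probability)
pipeline = [(10_000, 0.6), (25_000, 0.3), (5_000, 0.9)]
print(expected_pipeline_value(pipeline))  # 18000.0
```

The same structure extends naturally to forecasting by period: group proposals by expected close date and compute the expected value of each group.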
Added by Pablo Gutierrez on November 26, 2019 at 3:05am