This is another spectacular property of the exponential distribution, and also the first time, outside the uniform distribution, that an explicit formula is obtained for the variance of the range. It has important consequences, and the result is also useful in applications.
Theorem
The range R(n) associated with n independent random variables with an exponential distribution of parameter λ satisfies

E[R(n)] = H(n-1) / λ,    Var[R(n)] = ( π^2 / 6 - ψ_1(n) ) / λ^2,

where H(n-1) is a harmonic number and ψ_1 is the Trigamma function.
Before proving the theorem, note that the first formula is well known; only the second one is new. The standard proof for the expectation is not simple: it relies on computing the expectation of the maximum (see here) and on the fact that the minimum also has an exponential distribution with known expectation (see here). Our proof is simpler and also covers the variance.
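As a quick sanity check (not part of the proof), both formulas, E[R(n)] = H(n-1)/λ and Var[R(n)] = (π^2/6 - ψ_1(n))/λ^2 = (1/1^2 + ... + 1/(n-1)^2)/λ^2, can be verified by simulation. The values n = 5, λ = 2 and the sample size below are arbitrary choices:

```python
import numpy as np

# Monte Carlo check of E[R(n)] = H(n-1)/lambda and
# Var[R(n)] = (pi^2/6 - psi_1(n))/lambda^2 = (sum_{k<n} 1/k^2)/lambda^2.
n, lam, N = 5, 2.0, 200_000
rng = np.random.default_rng(0)

x = rng.exponential(scale=1.0 / lam, size=(N, n))
r = x.max(axis=1) - x.min(axis=1)           # the range R(n)

k = np.arange(1, n)                         # k = 1, ..., n-1
mean_theory = np.sum(1.0 / k) / lam         # H(n-1) / lambda
var_theory = np.sum(1.0 / k**2) / lam**2    # (pi^2/6 - psi_1(n)) / lambda^2

print(r.mean(), mean_theory)                # both close to 1.0417
print(r.var(), var_theory)                  # both close to 0.3559
```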
Proof
The general distribution of the range is known for any distribution, see here. The range is defined as

R(n) = max(X_1, ..., X_n) - min(X_1, ..., X_n),

where X_1, ..., X_n are the n random variables in question.
In the case of the exponential distribution, the range computed on n random variables has the following density (see here page 3):

f(r) = (n-1) λ exp(-λr) (1 - exp(-λr))^{n-2},  for r > 0.
With the simple change of variable u = λr, the k-th moment of the range is equal to

E[R^k(n)] = ( (n-1) / λ^k ) ∫_0^∞ u^k exp(-u) (1 - exp(-u))^{n-2} du.
Using WolframAlpha (see here and here) one obtains

λ E[R(n)] = H(n-1),    λ^2 E[R^2(n)] = H(n-1)^2 + H_2(n-1),

where H_2(n-1) = 1/1^2 + 1/2^2 + ... + 1/(n-1)^2 is the harmonic number of order 2.
Thus,

Var[R(n)] = E[R^2(n)] - (E[R(n)])^2 = H_2(n-1) / λ^2.
The two symbols H(n-1) and ψ_1(n) represent the harmonic numbers and the Trigamma function, respectively. To complete the proof, use the fact that

ψ_1(n) = 1/n^2 + 1/(n+1)^2 + 1/(n+2)^2 + ...,  with in particular ψ_1(1) = π^2 / 6,

so that H_2(n-1) = ψ_1(1) - ψ_1(n) = π^2 / 6 - ψ_1(n).
∎
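The closed forms obtained in the proof can be double-checked numerically. The sketch below evaluates the moment integral (n-1) ∫_0^∞ u^k exp(-u) (1 - exp(-u))^{n-2} du by quadrature (scipy is used for the integration; the choice n = 6 is arbitrary) and compares it with the harmonic-number expressions for k = 1, 2:

```python
import numpy as np
from scipy.integrate import quad

# Check (n-1) * Integral_0^inf u^k e^{-u} (1-e^{-u})^{n-2} du, which equals
# lambda^k * E[R^k(n)], against the closed forms for k = 1 and k = 2.
n = 6
integrand = lambda u, k: u**k * np.exp(-u) * (1.0 - np.exp(-u))**(n - 2)

m1 = (n - 1) * quad(integrand, 0, np.inf, args=(1,))[0]
m2 = (n - 1) * quad(integrand, 0, np.inf, args=(2,))[0]

j = np.arange(1, n)                         # 1, ..., n-1
H1, H2 = np.sum(1.0 / j), np.sum(1.0 / j**2)

print(m1, H1)                               # lambda   * E[R]   = H(n-1)
print(m2, H1**2 + H2)                       # lambda^2 * E[R^2] = H(n-1)^2 + H_2(n-1)
```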
There are a number of interesting consequences to this result. First, the expectation of the range grows indefinitely, asymptotically like log(n) / λ. Also, the variance of the range increases slowly and converges to π^2 / (6λ^2). This is in contrast to the uniform distribution: its range is bounded, and its variance tends to zero as fast as 1 / n^2; see section 2.3 in this article.
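These asymptotics can be illustrated numerically: λ E[R(n)] = H(n-1) behaves like log(n) plus the Euler-Mascheroni constant, while λ^2 Var[R(n)] converges to π^2/6 (the values of n below are arbitrary):

```python
import numpy as np

# lambda * E[R(n)] = H(n-1) grows like log(n) + gamma, while
# lambda^2 * Var[R(n)] = sum_{k<n} 1/k^2 converges to pi^2/6.
gamma = 0.5772156649  # Euler-Mascheroni constant
for n in (100, 10_000, 1_000_000):
    k = np.arange(1, n)
    print(n, np.sum(1.0 / k) - np.log(n), np.sum(1.0 / k**2))
# First column approaches gamma, second column approaches pi^2/6 = 1.6449...
```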
This result is pretty deep. It is almost as if the range, for the exponential distribution, were a weighted sum of n-1 independent exponential variables with the same parameter λ, the k-th term entering the sum with a weight equal to 1 / k.
But perhaps most importantly, we have found the two extreme cases of a new statistical theorem (see here, section 1), which states that the length of any confidence interval attached to an estimator is asymptotically equal to A / n^B, with B between 0 and 1. This length is usually proportional to the standard deviation of the estimator in question. In practice, in almost all cases, B = 1/2. However, here we have B = 1 for the range of the uniform distribution (the standard deviation decays like 1 / n) and B = 0 for the range of the exponential distribution (the standard deviation tends to a constant).
For normal variables, Var[Range] = O(1/n) and E[Range] = O(SQRT(log n)), thus B = 1/2 (see here and here). These results are summarized in the table below:
Distribution    E[Range]          Stdev[Range]
Uniform         O(1)              O(1 / n)
Exponential     O(log n)          O(1)
Normal          O(SQRT(log n))    O(1 / SQRT(n))

Order of magnitude for the expectation and Stdev of the range
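The two extreme behaviors can be observed in a short simulation: as n doubles, the standard deviation of the uniform range shrinks roughly like 1/n, while that of the exponential range barely moves. The sample sizes below are arbitrary, and both distributions use unit parameters:

```python
import numpy as np

# Standard deviation of the range as n doubles: ~1/n decay for the
# uniform distribution (B = 1), near-constant for the exponential (B = 0).
rng = np.random.default_rng(1)
N = 100_000
sds_u, sds_e = [], []
for n in (10, 20, 40):
    u = rng.uniform(size=(N, n))
    e = rng.exponential(size=(N, n))
    sds_u.append((u.max(axis=1) - u.min(axis=1)).std())
    sds_e.append((e.max(axis=1) - e.min(axis=1)).std())
    print(n, round(sds_u[-1], 4), round(sds_e[-1], 4))
```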
Finally, the same technique could be used to compute higher moments, or to compute the variance of the range for other probability distributions. It could also help with studying the convergence of the re-scaled range and its associated Hurst exponent, see section 6.1 in this article for details.
To not miss this type of content in the future, subscribe to our newsletter. For related articles from the same author, click here or visit www.VincentGranville.com. Follow me on LinkedIn, or visit my old web page here.
Comment
Joe Blitzstein (who teaches probability at Harvard University) pointed out (see here) that my theorem is a particular case of a general result about exponential distributions, known as the Rényi representation. This general result is illustrated in the picture below and in this document.
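The Rényi representation can be illustrated by simulation: the range of n exponentials should match, in distribution, the weighted sum X_1/1 + X_2/2 + ... + X_{n-1}/(n-1) of iid exponentials with the same parameter λ. The parameters n = 8, λ = 1.5 and the quantile grid below are arbitrary choices:

```python
import numpy as np

# Range of n exponentials vs. the Renyi-type sum X_1/1 + ... + X_{n-1}/(n-1),
# where the X_k are iid exponential(lambda): same distribution.
rng = np.random.default_rng(2)
n, lam, N = 8, 1.5, 200_000

x = rng.exponential(scale=1.0 / lam, size=(N, n))
r = x.max(axis=1) - x.min(axis=1)

w = 1.0 / np.arange(1, n)                   # weights 1/k, k = 1, ..., n-1
s = rng.exponential(scale=1.0 / lam, size=(N, n - 1)) @ w

# Largest gap between the two empirical quantile functions: should be small
q = np.linspace(0.01, 0.95, 95)
dist = np.max(np.abs(np.quantile(r, q) - np.quantile(s, q)))
print(r.mean(), s.mean(), dist)
```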
This also brings up something very interesting: since my proof relies on the fact that the sum of the inverses of the squares is π^2/6, and since Rényi's argument is entirely probabilistic, it is thus possible to prove, using probabilistic arguments alone, that the sum of the inverses of the squares is π^2/6. I will look at higher moments to see whether other facts about mathematical constants or integrals can be proved (thanks, Rényi!) using probabilistic arguments alone. With some luck, I might even discover a new relationship.
Finally, another way to prove the result is to use the fact (see here) that the range R(n) has the same distribution as the maximum of n-1 independent exponential variables with the same parameter λ: integrating the density at the start of the proof gives the CDF (1 - exp(-λr))^{n-1}. The known formulas for the expectation and variance of the maximum then yield the theorem directly.
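One can also check by simulation that the range of n exponentials follows the CDF (1 - exp(-λr))^{n-1}, i.e., that it is distributed like the maximum of n-1 exponentials. A sketch, with arbitrary parameter choices:

```python
import numpy as np

# Empirical CDF of the simulated range vs. the closed-form
# F(r) = (1 - e^{-lambda r})^{n-1}, the CDF of the max of n-1 exponentials.
rng = np.random.default_rng(3)
n, lam, N = 6, 1.0, 200_000

x = rng.exponential(scale=1.0 / lam, size=(N, n))
r = np.sort(x.max(axis=1) - x.min(axis=1))

ecdf = np.arange(1, N + 1) / N                  # empirical CDF at sorted r
cdf = (1.0 - np.exp(-lam * r))**(n - 1)         # closed-form CDF
ks = np.max(np.abs(ecdf - cdf))                 # ~ Kolmogorov-Smirnov statistic
print(ks)                                       # small: distributions agree
```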
© 2020 Data Science Central ®