Exponential Distribution
Definition 1
The continuous probability distribution $\exp ( \lambda)$ with the following probability density function, where $\lambda > 0$, is called the Exponential Distribution. $$ f(x) = \lambda e^{-\lambda x} \qquad , x \ge 0 $$
- Depending on the book, the parameter might be its reciprocal, $\displaystyle \theta = {{ 1 } \over { \lambda }}$.
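As a quick check that this is a valid probability density function, it integrates to one over its support: $$ \int_{0}^{\infty} \lambda e^{-\lambda x} dx = \left[ - e^{-\lambda x} \right]_{0}^{\infty} = 0 - (-1) = 1 $$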
Basic Properties
Moment Generating Function
- [1]: $$m(t) = {{ \lambda } \over { \lambda - t }} \qquad , t < \lambda$$
Mean and Variance
- [2]: If $X \sim \exp ( \lambda)$, then $$ \begin{align*} E(X) =& {{ 1 } \over { \lambda }} \\ \operatorname{Var} (X) =& {{ 1 } \over { \lambda^{2} }} \end{align*} $$
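As a quick numerical sanity check (the rate $\lambda = 2$ and the sample size below are arbitrary choices), the following Julia sketch compares these formulas with Distributions.jl; note that Exponential there is parameterized by the scale $\displaystyle \theta = {{ 1 } \over { \lambda }}$ mentioned in the definition.
using Distributions, Statistics
λ = 2.0                          # arbitrary rate for the check
d = Exponential(1/λ)             # Distributions.jl takes the scale θ = 1/λ
println(mean(d), " ≈ ", 1/λ)     # exact mean 1/λ = 0.5
println(var(d), " ≈ ", 1/λ^2)    # exact variance 1/λ² = 0.25
X = rand(d, 10^6)                # large Monte Carlo sample from exp(λ)
println(mean(X), " ", var(X))    # both should be close to the values above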
Sufficient Statistic and Maximum Likelihood Estimator
- [3]: Let $\mathbf{X} := \left( X_{1} , \cdots , X_{n} \right) \sim \exp \left( \lambda \right)$ be a given random sample.
The sufficient statistic $T$ and maximum likelihood estimator $\hat{\lambda}$ for $\lambda$ are as follows. $$ \begin{align*} T =& \sum_{k=1}^{n} X_{k} \\ \hat{\lambda} =& {{ n } \over { \sum_{k=1}^{n} X_{k} }} \end{align*} $$
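As a small sketch (the true rate $\lambda = 3$ and the sample size are arbitrary), the estimator can be computed directly from a simulated sample in Julia:
using Distributions
λ = 3.0                         # arbitrary "true" rate
n = 10^5
X = rand(Exponential(1/λ), n)   # random sample from exp(λ); Distributions.jl takes the scale 1/λ
T = sum(X)                      # sufficient statistic T = ΣXₖ
λ̂ = n / T                       # maximum likelihood estimator n / ΣXₖ
println(λ̂, " ≈ ", λ)            # λ̂ should be close to the true rate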
Theorems
Memorylessness
- [a]: If $X \sim \exp ( \lambda ) $, then $$ P ( X \ge s + t \mid X \ge s ) = P (X \ge t) $$
Relationship with Gamma Distribution
- [b]: $$\Gamma \left( 1, { 1 \over \lambda } \right) \iff \exp (\lambda)$$
Generalization to Weibull Distribution
- [c]: The exponential distribution is the special case $k = 1$ of the Weibull distribution. $$ f(x) = {{ k } \over { \theta }} \left( {{ x } \over { \theta }} \right)^{k-1} e^{-(x/\theta)^{k}} \qquad , x \ge 0 $$
Explanation
Relationship with Geometric Distribution
The exponential distribution describes the waiting time until an event of interest occurs, and can be viewed as a continuous analogue of the geometric distribution. Just as the geometric distribution can be generalized over the number of occurrences, generalizing the exponential distribution over the number of occurrences yields the gamma distribution.
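As an illustrative sketch of this analogy (the rate, step size, and sample size below are arbitrary), divide time into steps of length $\Delta$ with success probability $p = \lambda \Delta$ per step; for small $\Delta$, the resulting waiting time behaves approximately like $\exp ( \lambda )$.
using Distributions, Statistics
λ, Δ = 2.0, 0.001                        # arbitrary rate and small time step
p = λ * Δ                                # success probability per step
G = rand(Geometric(p), 10^6)             # number of failures before the first success
W = (G .+ 1) .* Δ                        # waiting times on the discrete time grid
println(mean(W), " ≈ ", 1/λ)             # ≈ 1/λ, the exponential mean
println(mean(W .> 1.0), " ≈ ", exp(-λ))  # P(W > 1) ≈ e^{-λ}, the exponential tail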
Relationship with Poisson Distribution
On the other hand, the Poisson distribution and the exponential distribution describe the same kind of phenomenon from different angles: one is concerned with the number of events per unit time, the other with the time until an event occurs. This relationship is why some books use the same Greek letter $\lambda$ for both. In particular, since the mean of the Poisson distribution is $\lambda$ while the mean of the exponential distribution is $\displaystyle {{ 1 } \over { \lambda }}$, the two distributions can be seen as standing in a kind of ‘inverse’ relationship.
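This connection can be illustrated by simulation (the rate $\lambda = 4$ and the number of waiting times are arbitrary): accumulating $\exp ( \lambda )$ waiting times gives the arrival times of events, and the number of arrivals falling in each unit-time interval is then approximately Poisson distributed with mean $\lambda$.
using Distributions, Statistics
λ = 4.0                                  # arbitrary event rate
waits = rand(Exponential(1/λ), 10^5)     # exp(λ) waiting times; Distributions.jl takes the scale 1/λ
arrivals = cumsum(waits)                 # arrival times of successive events
horizon = floor(Int, arrivals[end])      # number of whole unit-time intervals observed
counts = zeros(Int, horizon)             # events observed in each unit interval
for t in arrivals
    t < horizon && (counts[floor(Int, t) + 1] += 1)
end
println(mean(counts), " ≈ ", λ)          # Poisson mean λ
println(var(counts), " ≈ ", λ)           # Poisson variance λ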
Proof
[1]
For $t < \lambda$, the integral converges and $$ \begin{align*} m(t) =& \int_{0}^{\infty} e^{tx} f(x) dx \\ =& \int_{0}^{\infty} e^{tx} \lambda e^{-\lambda x} dx \\ =& \lambda \int_{0}^{\infty} e^{(t - \lambda ) x} dx \\ =& \lambda {{ 1 } \over { t - \lambda }} [ 0 - 1 ] \\ =& {{ \lambda } \over { \lambda - t }} \end{align*} $$
■
[2]
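One way to see this is to differentiate the moment generating function from [1]: $$ \begin{align*} m'(t) =& {{ \lambda } \over { (\lambda - t)^{2} }} \implies E(X) = m'(0) = {{ 1 } \over { \lambda }} \\ m''(t) =& {{ 2 \lambda } \over { (\lambda - t)^{3} }} \implies E \left( X^{2} \right) = m''(0) = {{ 2 } \over { \lambda^{2} }} \end{align*} $$ so that $\displaystyle \operatorname{Var} (X) = E \left( X^{2} \right) - E(X)^{2} = {{ 2 } \over { \lambda^{2} }} - {{ 1 } \over { \lambda^{2} }} = {{ 1 } \over { \lambda^{2} }}$.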
■
[3]
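In outline, the joint density factors as a function of $T$ alone, so $T$ is sufficient by the factorization theorem, and setting the derivative of the log-likelihood to zero gives $\hat{\lambda}$: $$ \begin{align*} L \left( \lambda ; \mathbf{x} \right) =& \prod_{k=1}^{n} \lambda e^{-\lambda x_{k}} = \lambda^{n} e^{- \lambda \sum_{k=1}^{n} x_{k}} \\ {{ \partial } \over { \partial \lambda }} \log L =& {{ n } \over { \lambda }} - \sum_{k=1}^{n} x_{k} = 0 \implies \hat{\lambda} = {{ n } \over { \sum_{k=1}^{n} x_{k} }} \end{align*} $$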
■
[a]
Derived using conditional probability.
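Concretely, since $P ( X \ge x ) = e^{-\lambda x}$ for $x \ge 0$ and $\left\{ X \ge s + t \right\} \subset \left\{ X \ge s \right\}$, $$ P ( X \ge s + t \mid X \ge s ) = {{ P ( X \ge s + t ) } \over { P ( X \ge s ) }} = {{ e^{-\lambda (s + t)} } \over { e^{-\lambda s} }} = e^{-\lambda t} = P ( X \ge t ) $$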
■
[b]
Shown by the moment generating function.
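Concretely, the moment generating function of the gamma distribution $\Gamma ( k , \theta )$ is $\left( 1 - \theta t \right)^{-k}$ for $\displaystyle t < {{ 1 } \over { \theta }}$, so for $\Gamma \left( 1 , {{ 1 } \over { \lambda }} \right)$ it becomes $$ \left( 1 - {{ t } \over { \lambda }} \right)^{-1} = {{ \lambda } \over { \lambda - t }} \qquad , t < \lambda $$ which coincides with the moment generating function of $\exp ( \lambda )$ in [1].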
■
[c]
It’s self-evident from the probability density function.
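Indeed, substituting $k = 1$ and $\displaystyle \theta = {{ 1 } \over { \lambda }}$ into the density above gives $$ f(x) = {{ 1 } \over { \theta }} e^{-x/\theta} = \lambda e^{-\lambda x} \qquad , x \ge 0 $$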
■
Visualization
The following Julia code produces an animated GIF of the probability density function of the exponential distribution. Note that Exponential in Distributions.jl is parameterized by the scale $\displaystyle \theta = {{ 1 } \over { \lambda }}$, so the rate is passed as its reciprocal.
@time using LaTeXStrings
@time using Distributions
@time using Plots
cd(@__DIR__)
x = 0:0.1:10
Λ = collect(0.1:0.1:5.0); append!(Λ, reverse(Λ))    # sweep the rate λ up, then back down
animation = @animate for λ ∈ Λ
    plot(x, pdf.(Exponential(1/λ), x),              # Distributions.jl's Exponential takes the scale θ = 1/λ
        color = :black,
        label = "λ = $(round(λ, digits = 2))", size = (400, 300))
    xlims!(0, 10); ylims!(0, 5); title!(L"\mathrm{pdf\,of\,} \exp(\lambda)")   # y-range covers pdf(0) = λ ≤ 5
end
gif(animation, "pdf.gif")
Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p159. ↩︎