

Poisson Distribution

Definition 1

(Animation of the pmf of Poi(λ) as λ varies: pmf.gif, generated by the code at the end of this post.)

For $\lambda > 0$, the discrete probability distribution $\text{Poi} ( \lambda )$ with the following probability mass function is called the Poisson Distribution. $$ p(x) = {{ e^{-\lambda} \lambda^{x} } \over { x! }} \qquad , x = 0 , 1 , 2, \cdots $$
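As a quick sanity check, the pmf above can be evaluated directly and compared against the built-in Poisson distribution from the Distributions.jl package used later in this post. This is only an illustrative sketch; the helper name poi_pmf and the choice λ = 3 are arbitrary.

using Distributions

# pmf written exactly as in the definition above
poi_pmf(x, λ) = exp(-λ) * λ^x / factorial(x)

λ = 3.0
for x in 0:5
    println(x, "  ", poi_pmf(x, λ), "  ", pdf(Poisson(λ), x))  # the two values should agree
end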

Basic Properties

Moment Generating Function

  • [1]: $$m(t) = \exp \left[ \lambda \left( e^{t} - 1 \right) \right] \qquad , t \in \mathbb{R}$$
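As a rough numerical check of [1], one can compare a Monte Carlo estimate of $E \left[ e^{tX} \right]$ with the closed form $\exp \left[ \lambda \left( e^{t} - 1 \right) \right]$. The sample size and the values of $t$ below are arbitrary.

using Distributions, Statistics

λ, N = 2.5, 10^6
X = rand(Poisson(λ), N)                   # large Poisson(λ) sample
for t in (-0.5, 0.0, 0.5)
    estimate    = mean(exp.(t .* X))      # Monte Carlo estimate of E[e^{tX}]
    closed_form = exp(λ * (exp(t) - 1))   # formula [1]
    println("t = $t : $estimate ≈ $closed_form")
end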

Mean and Variance

  • [2]: If $X \sim \text{Poi}(\lambda)$ then $$ \begin{align*} E(X) =& \lambda \\ \operatorname{Var}(X) =& \lambda \end{align*} $$
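A minimal numerical check of [2], using both the exact moments provided by Distributions.jl and a simulated sample (λ and the sample size are arbitrary):

using Distributions, Statistics

λ = 4.0
d = Poisson(λ)
println(mean(d), "  ", var(d))    # both equal λ exactly

X = rand(d, 10^6)
println(mean(X), "  ", var(X))    # sample mean and variance, both close to λ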

Sufficient Statistic and Maximum Likelihood Estimation

  • [3]: Suppose we have a random sample $\mathbf{X} := \left( X_{1} , \cdots , X_{n} \right) \sim \text{Poi} \left( \lambda \right)$.

The sufficient statistic $T$ and maximum likelihood estimate $\hat{\lambda}$ for $\lambda$ are as follows. $$ \begin{align*} T =& \sum_{k=1}^{n} X_{k} \\ \hat{\lambda} =& {{ 1 } \over { n }} \sum_{k=1}^{n} X_{k} \end{align*} $$
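Since the MLE in [3] is just the sample mean, it is easy to check with a simulated random sample. The true value of λ and the sample size below are arbitrary.

using Distributions, Statistics

λ_true, n = 3.2, 1_000
X = rand(Poisson(λ_true), n)      # random sample from Poi(λ_true)

T = sum(X)                        # sufficient statistic
λ_hat = T / n                     # maximum likelihood estimate = sample mean
println("T = $T, λ̂ = $λ_hat")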

Theorem

Derivation of Poisson Distribution as a Limit Distribution of Binomial Distribution

  • [a]: Let $X_{n} \sim B(n,p)$ with $p$ depending on $n$.

If $np \to \mu > 0$ as $n \to \infty$ then $$ X_{n} \overset{D}{\to} \text{Poi} (\mu) $$
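This convergence can be observed numerically by fixing $\mu$ and comparing the pmf of $B(n, \mu / n)$ with that of $\text{Poi}(\mu)$ as $n$ grows; the values below are chosen arbitrarily.

using Distributions

μ = 3.0
for n in (10, 100, 1000)
    p = μ / n
    err = maximum(abs(pdf(Binomial(n, p), x) - pdf(Poisson(μ), x)) for x in 0:20)
    println("n = $n : largest pmf difference ≈ $err")
end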

Derivation of Standard Normal Distribution as a Limit Distribution of Poisson Distribution

  • [b]: If $X_{n} \sim \text{Poi} \left( n \right)$ and $\displaystyle Y_{n} := {{ X_{n} - n } \over { \sqrt{n} }}$ then $$ Y_{n} \overset{D}{\to} N(0,1) $$
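Similarly, a rough numerical check of [b]: for large $n$, the distribution function of $Y_{n}$ should be close to that of $N(0,1)$. The values of $n$ and the grid of evaluation points below are arbitrary.

using Distributions

for n in (10, 100, 10_000)
    d = Poisson(n)
    # P(Y_n ≤ z) = P(X_n ≤ n + z√n), compared with the standard normal cdf
    err = maximum(abs(cdf(d, n + z * sqrt(n)) - cdf(Normal(), z)) for z in -3:0.5:3)
    println("n = $n : largest cdf difference ≈ $err")
end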

Explanation

Naming

The probability mass function of the Poisson distribution may seem complicated at first, but it actually originates from the series expansion of the exponential function. $$ e^{x} = 1 + {{ x } \over { 1 ! }} + {{ x^{2} } \over { 2! }} + {{ x^{3} } \over { 3! }} + \cdots $$ Substituting the parameter $x = \lambda$, which is treated as a fixed constant, and dividing both sides by $e^{\lambda}$ yields $$ 1 = {{ e^{-\lambda} \lambda^{0} } \over { 0! }} + {{ e^{-\lambda} \lambda^{1} } \over { 1! }} + {{ e^{-\lambda} \lambda^{2} } \over { 2! }} + {{ e^{-\lambda} \lambda^{3} } \over { 3! }} + \cdots $$ Thus, the probability mass function of the Poisson distribution sums to $1$ over its entire support, as every probability mass function must. Unlike the binomial, geometric, or negative binomial distributions, the Poisson distribution does not derive its name from the form of its formula.
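The normalization argument above is easy to verify numerically by summing the pmf over the first several values of $x$; the choice λ = 5 and the truncation point 30 are arbitrary, and big(x) is used only to avoid integer overflow in the factorial.

λ = 5.0
partial_sum = sum(exp(-λ) * λ^x / factorial(big(x)) for x in 0:30)
println(partial_sum)    # approaches 1, matching the series expansion of e^λ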

The physicist and mathematician Siméon Denis Poisson proposed, in his 1837 work Research on the Probability of Judgments in Criminal and Civil Matters, that the number of certain events occurring within a unit of time follows a particular distribution. This distribution was named the Poisson distribution after him, and his name is still attached to numerous concepts in probability theory and statistics.

Distribution with Equal Mean and Variance

Before various applications, the Poisson distribution itself is an interesting topic of research. One of the most notable properties of the Poisson distribution is that its mean and variance are equal to the parameter $\lambda$.

Relationship with Exponential Distribution

Meanwhile, the Poisson and exponential distributions describe related phenomena: the former concerns the number of events occurring in a unit of time, while the latter concerns the waiting time until an event occurs. This relationship is why some texts use the same Greek letter $\lambda$ for both. In particular, since the mean of the Poisson distribution is $\lambda$ while the mean of the exponential distribution is $\displaystyle {{ 1 } \over { \lambda }}$, the two distributions can be viewed as 'inverses' of each other in a sense.
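This relationship can be illustrated by a small simulation: if the waiting times between events are independent exponential variables with mean $1/\lambda$, then the number of events in one unit of time follows $\text{Poi}(\lambda)$. The sketch below is illustrative only; the helper name count_in_unit_time is hypothetical, and λ and the number of repetitions are arbitrary. Note that Distributions.jl parameterizes Exponential by its mean, hence Exponential(1/λ).

using Distributions, Statistics

λ = 4.0

# count how many exponential inter-arrival times (mean 1/λ) fit into one unit of time
function count_in_unit_time(λ)
    t, k = 0.0, 0
    while true
        t += rand(Exponential(1 / λ))
        t > 1 && return k
        k += 1
    end
end

counts = [count_in_unit_time(λ) for _ in 1:10^5]
println(mean(counts), "  ", var(counts))    # both close to λ, as expected for Poi(λ)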

Proof

[1]

$$ \begin{align*} m(t) =& \sum_{x=0}^{\infty} e^{tx} p(x) \\ =& \sum_{x=0}^{\infty} e^{tx} {{ \lambda^{x} e^{-\lambda} } \over { x! }} \\ =& e^{-\lambda} \sum_{x=0}^{\infty} {{ \left( e^{t}\lambda \right)^{x} } \over { x! }} \\ =& e^{-\lambda} e^{\lambda e^{t}} \\ =& \exp \left[ -\lambda + \lambda e^{t} \right] \\ =& \exp \left[ \lambda ( e^{t} - 1) \right] \end{align*} $$

[2]

Direct deduction.

[3]

Direct deduction.

[a]

Approximation via moment generating function.

[b]

Approximation by omitting terms in Taylor expansion.

Code

Below is Julia code that renders the probability mass function of the Poisson distribution as an animated GIF (the pmf.gif shown above).

@time using LaTeXStrings
@time using Distributions
@time using Plots

cd(@__DIR__)

x = 0:20                                      # support shown in the plot
Λ = collect(1:0.1:10); append!(Λ, reverse(Λ)) # sweep λ up and back down so the animation loops

animation = @animate for λ ∈ Λ
    scatter(x, pdf.(Poisson(λ), x),           # pmf of Poi(λ) at each x
     color = :black,
     label = "λ = $(round(λ, digits = 2))", size = (400,300))
    xlims!(0,10); ylims!(0,0.5); title!(L"\mathrm{pmf\,of\,Poi}(\lambda)")
end
gif(animation, "pmf.gif")                     # save the animation as pmf.gif

  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p152. ↩︎