
Poisson Distribution

Definition 1

Figure: animated probability mass function of the Poisson distribution (pmf.gif)

For $\lambda > 0$, the discrete probability distribution $\text{Poi}(\lambda)$ with the following probability mass function is called the Poisson Distribution.
$$ p(x) = {{ e^{-\lambda} \lambda^{x} } \over { x! }} \qquad , x = 0 , 1 , 2, \cdots $$

Basic Properties

Moment Generating Function

  • [1]: $$ m(t) = \exp \left[ \lambda \left( e^{t} - 1 \right) \right] \qquad , t \in \mathbb{R} $$

Mean and Variance

  • [2]: If $X \sim \text{Poi}(\lambda)$ then $$ \begin{align*} E(X) =& \lambda \\ \operatorname{Var}(X) =& \lambda \end{align*} $$
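As a quick numerical sanity check of [2], the mean and variance can be computed directly from the pmf. The snippet below is a minimal sketch in base Julia (no packages); the truncation at $x = 200$ and the choice $\lambda = 4$ are arbitrary, picked so the neglected tail mass is negligible.

```julia
# Verify E(X) = Var(X) = λ by summing over the Poisson pmf.
# BigInt factorial avoids integer overflow for large x.
λ = 4.0
p(x) = exp(-λ) * λ^x / factorial(big(x))   # Poisson pmf
xs = 0:200                                 # truncation: tail mass beyond 200 is negligible
m = sum(x * p(x) for x in xs)              # E(X)
v = sum((x - m)^2 * p(x) for x in xs)      # Var(X)
println(Float64(m), " ", Float64(v))       # both ≈ 4.0
```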

Sufficient Statistic and Maximum Likelihood Estimation

  • [3]: Suppose we have a random sample $\mathbf{X} := \left( X_{1} , \cdots , X_{n} \right) \sim \text{Poi} \left( \lambda \right)$.

The sufficient statistic $T$ and maximum likelihood estimator $\hat{\lambda}$ for $\lambda$ are as follows. $$ \begin{align*} T =& \sum_{k=1}^{n} X_{k} \\ \hat{\lambda} =& {{ 1 } \over { n }} \sum_{k=1}^{n} X_{k} \end{align*} $$
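Since $\hat{\lambda}$ is just the sample mean, it is easy to check by simulation. The sketch below stays in base Julia by using Knuth's product-of-uniforms Poisson sampler (adequate for small $\lambda$); `rand_poisson` is a made-up helper name, not a library function.

```julia
using Random, Statistics

# Knuth's Poisson sampler: multiply uniforms until the product
# drops below e^{-λ}. `rand_poisson` is a hypothetical name.
function rand_poisson(rng, λ)
    L, k, p = exp(-λ), 0, 1.0
    while true
        p *= rand(rng)
        p <= L && return k
        k += 1
    end
end

rng = MersenneTwister(1)
λ = 3.0
sample = [rand_poisson(rng, λ) for _ in 1:100_000]
λ_hat = mean(sample)   # the MLE: close to λ = 3
```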

Theorem

Derivation of Poisson Distribution as a Limit Distribution of Binomial Distribution

  • [a]: Let $X_{n} \sim B(n,p)$.

If $p$ depends on $n$ so that $np \to \mu > 0$ as $n \to \infty$, then $$ X_{n} \overset{D}{\to} \text{Poi} (\mu) $$
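Theorem [a] can be seen numerically by comparing the pmf of $B(n, \mu/n)$ with that of $\text{Poi}(\mu)$ for large $n$. The sketch below uses only base Julia; $\mu = 3$ and $n = 10{,}000$ are arbitrary choices.

```julia
# Pointwise comparison of B(n, μ/n) and Poi(μ) pmfs for large n.
μ, n = 3.0, 10_000
pois(x)  = exp(-μ) * μ^x / factorial(big(x))
binom(x) = binomial(big(n), big(x)) * (μ / n)^x * (1 - μ / n)^(n - x)
gap = maximum(abs(Float64(binom(x) - pois(x))) for x in 0:20)
println(gap)   # on the order of 1e-4: the pmfs nearly coincide
```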

Derivation of Standard Normal Distribution as a Limit Distribution of Poisson Distribution

  • [b]: If $X_{n} \sim \text{Poi} \left( n \right)$ and $\displaystyle Y_{n} := {{ X_{n} - n } \over { \sqrt{n} }}$ then $$ Y_{n} \overset{D}{\to} N(0,1) $$
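Theorem [b] can also be checked numerically: for $X_{n} \sim \text{Poi}(n)$, $P(Y_{n} \le 1) = P(X_{n} \le n + \sqrt{n})$ should approach $\Phi(1) \approx 0.8413$. The sketch below evaluates the Poisson cdf in base Julia via a recursive pmf term; $n = 400$ is an arbitrary choice, and the small remaining gap is the usual continuity-correction error.

```julia
# cdf of Poi(λ) at t, using the recursion pmf(x) = pmf(x-1) · λ/x.
function poisson_cdf(λ, t)
    term = exp(-λ)        # pmf at x = 0
    s = term
    for x in 1:t
        term *= λ / x
        s += term
    end
    return s
end

n = 400
lhs = Float64(poisson_cdf(big(400.0), n + isqrt(n)))  # P(X_n ≤ n + √n)
println(lhs)   # ≈ 0.85, near Φ(1) ≈ 0.8413
```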

Explanation

Naming

The probability mass function of the Poisson distribution may look complicated at first, but it actually comes from the series expansion of the exponential function.
$$ e^{x} = 1 + {{ x } \over { 1! }} + {{ x^{2} } \over { 2! }} + {{ x^{3} } \over { 3! }} + \cdots $$
Setting $x = \lambda$, which is assumed to be a fixed parameter, and dividing both sides by the constant $e^{\lambda}$ yields
$$ 1 = {{ e^{-\lambda} \lambda^{0} } \over { 0! }} + {{ e^{-\lambda} \lambda^{1} } \over { 1! }} + {{ e^{-\lambda} \lambda^{2} } \over { 2! }} + {{ e^{-\lambda} \lambda^{3} } \over { 3! }} + \cdots $$
This shows that the probability mass function of the Poisson distribution sums to $1$ over its support. Unlike the binomial, geometric, or negative binomial distributions, the Poisson distribution does not derive its name from its formula.
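The identity above is easy to confirm numerically; the snippet below is a minimal base-Julia sketch, with the truncation at $x = 100$ and the value $\lambda = 2.5$ chosen arbitrarily so the neglected tail is negligible.

```julia
# Partial sum of the Poisson pmf e^{-λ} λ^x / x! over x = 0…100.
λ = 2.5
total = sum(exp(-λ) * λ^x / factorial(big(x)) for x in 0:100)
println(Float64(total))   # ≈ 1.0
```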

The great mathematician and physicist Siméon Denis Poisson proposed, in his 1837 paper Research on the Probability of Judgments in Criminal and Civil Matters, that the number of certain events occurring within a unit of time follows a specific distribution. The distribution was named after him, and his name remains attached to numerous results in probability theory and statistical techniques.

Distribution with Equal Mean and Variance

Even apart from its many applications, the Poisson distribution is itself an interesting object of study. One of its most notable properties is that its mean and variance are both equal to the parameter $\lambda$.

Relationship with Exponential Distribution

Meanwhile, the Poisson and exponential distributions describe related phenomena: the former concerns the number of events occurring in a unit of time, while the latter concerns the waiting time until an event occurs. This relationship leads some texts to use the same Greek letter $\lambda$ for both. In particular, since the mean of the Poisson distribution is $\lambda$ while the mean of the exponential distribution is $\displaystyle {{ 1 } \over { \lambda }}$, the relationship between the two can be seen as a kind of 'inverse'.
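This 'inverse' relationship can be illustrated by simulating a Poisson process: when inter-arrival times are exponential with mean $1/\lambda$, the number of arrivals in a unit window is $\text{Poi}(\lambda)$. The sketch below uses only base Julia (inverse-transform sampling, $-\log(U)/\lambda$, for the exponential gaps); `count_window` is a made-up helper name.

```julia
using Random, Statistics

# Count arrivals in [0, 1) when gaps are exponential with mean 1/λ.
# `count_window` is a hypothetical name, not a library function.
function count_window(rng, λ)
    t, k = 0.0, 0
    while true
        t += -log(rand(rng)) / λ   # exponential inter-arrival time
        t > 1 && return k
        k += 1
    end
end

rng = MersenneTwister(1)
λ = 4.0
counts = [count_window(rng, λ) for _ in 1:100_000]
println(mean(counts))   # ≈ λ = 4, matching the Poisson mean
```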

Proof

[1]

$$ \begin{align*} m(t) =& \sum_{x=0}^{\infty} e^{tx} p(x) \\ =& \sum_{x=0}^{\infty} e^{tx} {{ \lambda^{x} e^{-\lambda} } \over { x! }} \\ =& e^{-\lambda} \sum_{x=0}^{\infty} {{ \left( e^{t} \lambda \right)^{x} } \over { x! }} \\ =& e^{-\lambda} e^{\lambda e^{t}} \\ =& \exp \left[ -\lambda + \lambda e^{t} \right] \\ =& \exp \left[ \lambda ( e^{t} - 1) \right] \end{align*} $$
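The closed form at the end of this derivation can be double-checked numerically by truncating the series $E[e^{tX}]$; a base-Julia sketch with arbitrary $\lambda = 1.5$, $t = 0.3$:

```julia
# Compare the truncated series E[e^{tX}] with exp(λ(e^t − 1)).
λ, t = 1.5, 0.3
lhs = sum(exp(t * x) * exp(-λ) * λ^x / factorial(big(x)) for x in 0:200)
rhs = exp(λ * (exp(t) - 1))
println(Float64(lhs) - rhs)   # ≈ 0
```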

[2]

Direct deduction.

[3]

Direct deduction.

[a]

Approximation via moment generating function.

[b]

Approximation by omitting terms in Taylor expansion.

Code

Below is Julia code that renders the probability mass function of the Poisson distribution as an animated GIF.

@time using LaTeXStrings
@time using Distributions
@time using Plots

cd(@__DIR__)   # save the GIF next to this script

x = 0:20
Λ = collect(1:0.1:10); append!(Λ, reverse(Λ))   # sweep λ up and back down

animation = @animate for λ ∈ Λ
    scatter(x, pdf.(Poisson(λ), x),             # pmf of Poi(λ) at x = 0, 1, …, 20
     color = :black,
     label = "λ = $(round(λ, digits = 2))", size = (400, 300))
    xlims!(0, 10); ylims!(0, 0.5); title!(L"\mathrm{pmf\,of\,Poi}(\lambda)")
end
gif(animation, "pmf.gif")

  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p152. ↩︎