

Mean and Variance of the Poisson Distribution

Formulas

If $X \sim \text{Poi}(\lambda)$, then
$$ E(X) = \lambda \\ \Var(X) = \lambda $$
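Before the derivation, here is a quick empirical illustration of both identities. This is only a sketch, not part of the statement itself: it assumes an arbitrary sample value $\lambda = 3$, an arbitrary sample size, and uses NumPy's Poisson sampler.

```python
import numpy as np

lam = 3.0
rng = np.random.default_rng(0)

# Draw a large Poisson(lam) sample and compare empirical moments to lam.
sample = rng.poisson(lam, size=1_000_000)

print(sample.mean())  # approximately lam
print(sample.var())   # approximately lam
```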

Derivation

Strategy: Derive the results directly from the definition of the Poisson distribution. The key trick is splitting the factorials and reindexing the series.

Definition of the Poisson Distribution: For $\lambda > 0$, the discrete probability distribution $\text{Poi}(\lambda)$ with the following probability mass function is called the Poisson distribution:
$$ p(x) = \frac{e^{-\lambda} \lambda^{x}}{x!} \qquad , x = 0, 1, 2, \cdots $$
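As a sanity check on this definition, the following sketch evaluates the mass function in plain Python for a sample $\lambda$ and confirms that the probabilities sum to approximately 1. The helper name `poisson_pmf` and the truncation point are my own choices for illustration.

```python
from math import exp, factorial

def poisson_pmf(x: int, lam: float) -> float:
    """p(x) = e^{-lambda} * lambda^x / x!  for x = 0, 1, 2, ..."""
    return exp(-lam) * lam**x / factorial(x)

lam = 3.0
# Truncate the infinite support; the tail beyond x = 100 is negligible for small lambda.
total = sum(poisson_pmf(x, lam) for x in range(100))
print(total)  # approximately 1.0
```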

Mean

$$
\begin{align*}
E(X) =& \sum_{x=0}^{\infty} x \frac{\lambda^{x} e^{-\lambda}}{x!}
\\ =& e^{-\lambda} \sum_{x=0}^{\infty} x \frac{\lambda^{x}}{x!}
\\ =& e^{-\lambda} \sum_{x=1}^{\infty} \frac{\lambda \cdot \lambda^{x-1}}{(x-1)!}
\\ =& \lambda e^{-\lambda} \sum_{x=1}^{\infty} \frac{\lambda^{x-1}}{(x-1)!}
\\ =& \lambda e^{-\lambda} e^{\lambda}
\\ =& \lambda
\end{align*}
$$
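The same truncated-series idea gives a numerical check of this result. The sketch below, under the assumptions of a sample $\lambda = 3$ and a cutoff at $x = 100$, sums $x \, p(x)$ and compares against $\lambda$.

```python
from math import exp, factorial

lam = 3.0
pmf = lambda x: exp(-lam) * lam**x / factorial(x)

# E(X) = sum over x of x * p(x); the series is truncated where terms are negligible.
mean = sum(x * pmf(x) for x in range(100))
print(mean)  # approximately 3.0 == lam
```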

Variance

$$
\begin{align*}
E(X^{2}) =& \sum_{x=0}^{\infty} x^{2} \frac{\lambda^{x} e^{-\lambda}}{x!}
\\ =& e^{-\lambda} \sum_{x=1}^{\infty} x \frac{\lambda^{x}}{(x-1)!}
\\ =& e^{-\lambda} \sum_{x=1}^{\infty} \frac{(x-1+1) \lambda^{x}}{(x-1)!}
\\ =& e^{-\lambda} \sum_{x=1}^{\infty} \frac{(x-1) \lambda^{x} + \lambda^{x}}{(x-1)!}
\\ =& e^{-\lambda} \sum_{x=1}^{\infty} \left\{ \frac{(x-1) \lambda^{x}}{(x-1)!} + \frac{\lambda^{x}}{(x-1)!} \right\}
\\ =& e^{-\lambda} \left\{ \sum_{x=2}^{\infty} \frac{\lambda^{x}}{(x-2)!} + \sum_{x=1}^{\infty} \frac{\lambda^{x}}{(x-1)!} \right\}
\\ =& e^{-\lambda} \left\{ \sum_{x=2}^{\infty} \frac{\lambda^{2} \cdot \lambda^{x-2}}{(x-2)!} + \sum_{x=1}^{\infty} \frac{\lambda \cdot \lambda^{x-1}}{(x-1)!} \right\}
\\ =& e^{-\lambda} \left( \lambda^{2} e^{\lambda} + \lambda e^{\lambda} \right)
\\ =& \lambda^{2} + \lambda
\end{align*}
$$

Therefore
$$
\Var(X) = E(X^{2}) - \left\{ E(X) \right\}^{2} = (\lambda^{2} + \lambda) - \lambda^{2} = \lambda
$$
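A corresponding numerical check of $E(X^{2}) = \lambda^{2} + \lambda$ and $\Var(X) = \lambda$, again a rough sketch with a truncated series and an arbitrary sample $\lambda$:

```python
from math import exp, factorial

lam = 3.0
pmf = lambda x: exp(-lam) * lam**x / factorial(x)

second_moment = sum(x**2 * pmf(x) for x in range(100))  # E(X^2)
mean = sum(x * pmf(x) for x in range(100))              # E(X)
variance = second_moment - mean**2                      # Var(X) = E(X^2) - {E(X)}^2

print(second_moment)  # approximately lam**2 + lam = 12.0
print(variance)       # approximately lam = 3.0
```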