
Mean and Variance of Gamma Distribution

Formula

Let the random variable $X$ follow the gamma distribution $\Gamma \left( k , \theta \right)$, written $X \sim \Gamma \left( k , \theta \right)$. Then $$ E(X) = k \theta \\ \operatorname{Var} (X) = k \theta^{2} $$

Derivation

Strategy: Derive directly from the definition of the gamma distribution and the basic properties of the gamma function. The trick is to rebalance the numerator and denominator of the coefficient as the power of $x$ changes, so that the integrand becomes the density of another gamma distribution and integrates to $1$.

Definition of Gamma Distribution: For $k, \theta > 0$, the continuous probability distribution $\Gamma ( k , \theta )$ with the following probability density function is called the Gamma Distribution. $$ f(x) = {{ 1 } \over { \Gamma ( k ) \theta^{k} }} x^{k - 1} e^{ - x / \theta} \qquad , x > 0 $$

Recursive Formula of the Gamma Function: $$ \Gamma (p+1)=p\Gamma (p) $$
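As a quick sanity check, this recursion can be verified numerically with the standard-library `math.gamma`; the values of $p$ below are arbitrary choices:

```python
import math

# Verify Gamma(p + 1) == p * Gamma(p) for a few arbitrary p > 0
for p in [0.5, 1.0, 2.7, 4.5]:
    lhs = math.gamma(p + 1)
    rhs = p * math.gamma(p)
    assert math.isclose(lhs, rhs), (p, lhs, rhs)
print("recursion holds for all tested p")
```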

Mean

$$ \begin{align*} E(X) =& \int _{0} ^{\infty} x { 1 \over { \Gamma ( k ) \theta^k } } x^{k - 1} e^{ - {{x} \over {\theta }} } dx \\ =& k \theta \int _{0} ^{\infty} { 1 \over { \Gamma (k+1) \theta^{k+1} } } x^{k} e^{ - {{x} \over {\theta}} } dx \\ =& {k \theta} \end{align*} $$ In the second line, $\Gamma (k) \theta^{k}$ is rewritten as $\Gamma (k+1) \theta^{k+1} / (k \theta)$ using $\Gamma (k+1) = k \Gamma (k)$; the remaining integral equals $1$ because the integrand is the probability density function of $\Gamma (k+1 , \theta)$.
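The result $E(X) = k \theta$ can also be checked numerically. The sketch below integrates $x f(x)$ with a simple Riemann sum using only the standard library; the parameter values $k = 3$, $\theta = 2$ and the truncation point of the integral are arbitrary choices for illustration.

```python
import math

def gamma_pdf(x, k, theta):
    """Density of the Gamma(k, theta) distribution for x > 0."""
    return x ** (k - 1) * math.exp(-x / theta) / (math.gamma(k) * theta ** k)

def mean_numeric(k, theta, upper=200.0, steps=200_000):
    """Approximate E(X) = integral of x * f(x) by a right Riemann sum on (0, upper]."""
    h = upper / steps
    return sum(i * h * gamma_pdf(i * h, k, theta) for i in range(1, steps + 1)) * h

k, theta = 3.0, 2.0
print(mean_numeric(k, theta))  # ≈ k * theta = 6.0
```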

Variance

$$ \begin{align*} E( X^2 ) =& \int _{0} ^{\infty} x^2 { 1 \over { \Gamma (k) \theta^k} } x^{k - 1} e^{ - {{x} \over {\theta}} } dx \\ =& k (k+1) \theta^2 \int _{0} ^{\infty} { 1 \over { \Gamma (k+2) \theta^{k+2} } } x^{k+1} e^{ - {{x} \over {\theta}} } dx \\ =& {k^2 \theta^2 + k \theta^2} \end{align*} $$ Here $\Gamma (k) \theta^{k}$ is rewritten as $\Gamma (k+2) \theta^{k+2} / \left( k (k+1) \theta^{2} \right)$ using $\Gamma (k+2) = (k+1) k \Gamma (k)$, and the remaining integral equals $1$ because the integrand is the probability density function of $\Gamma (k+2 , \theta)$. Therefore, $$ \begin{align*} \operatorname{Var}(X) =& (k^2 \theta^2 + k \theta^2) - (k \theta)^2 \\ =& k \theta^2 \end{align*} $$
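Likewise, $\operatorname{Var}(X) = k \theta^{2}$ can be checked numerically as $E(X^{2}) - E(X)^{2}$. The sketch below reuses the same Riemann-sum idea with only the standard library; the values $k = 3$, $\theta = 2$ and the truncation point are again arbitrary choices.

```python
import math

def gamma_pdf(x, k, theta):
    """Density of the Gamma(k, theta) distribution for x > 0."""
    return x ** (k - 1) * math.exp(-x / theta) / (math.gamma(k) * theta ** k)

def moment(n, k, theta, upper=200.0, steps=200_000):
    """Approximate E(X^n) by a right Riemann sum on (0, upper]."""
    h = upper / steps
    return sum((i * h) ** n * gamma_pdf(i * h, k, theta) for i in range(1, steps + 1)) * h

k, theta = 3.0, 2.0
var = moment(2, k, theta) - moment(1, k, theta) ** 2
print(var)  # ≈ k * theta**2 = 12.0
```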