Gamma Distribution

Definition 1

(Animated plots of the probability density function for $k = 1, 2, 4$; the GIFs pdf1.gif, pdf2.gif, pdf4.gif are generated by the code at the end of this post.)

For $k, \theta > 0$, the distribution $\Gamma ( k , \theta )$ with the following probability density function is called the Gamma Distribution. $$ f(x) = {{ 1 } \over { \Gamma ( k ) \theta^{k} }} x^{k - 1} e^{ - x / \theta} \qquad , x > 0 $$


  • $\Gamma$ represents the Gamma function.
  • The probability density function of the Gamma distribution can also be defined as follows for $\alpha , \beta > 0$; the two parameterizations are related simply by $\theta = {{ 1 } \over { \beta }}$ (with $k = \alpha$). $$ f(x) = {{ \beta^{\alpha } } \over { \Gamma ( \alpha ) }} x^{\alpha - 1} e^{ - \beta x} \qquad , x > 0 $$
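
Since the definition is given in shape–scale form, it is easy to check numerically that the two parameterizations agree and that they match Distributions.jl, whose Gamma(k, θ) also uses the shape–scale convention. A minimal sketch (the specific values of k, θ, x are arbitrary):

using Distributions
using SpecialFunctions  # gamma()

k, θ = 2.0, 1.5
β = 1 / θ               # rate parameterization
x = 3.0

f1 = x^(k - 1) * exp(-x / θ) / (gamma(k) * θ^k)   # shape–scale form
f2 = β^k * x^(k - 1) * exp(-β * x) / gamma(k)     # shape–rate form
f1 ≈ f2 ≈ pdf(Gamma(k, θ), x)                     # true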

Fundamental Properties

Moment Generating Function

  • [1]: $$m(t) = \left( 1 - \theta t\right)^{-k} \qquad , t < {{ 1 } \over { \theta }}$$
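
Since [1] only holds for $t < 1/\theta$, a Monte Carlo sanity check of the formula is straightforward; the parameter values below are arbitrary:

using Distributions, Statistics

k, θ, t = 3.0, 0.5, 0.8            # requires t < 1/θ = 2
X = rand(Gamma(k, θ), 10^6)

mc  = mean(exp.(t .* X))           # Monte Carlo estimate of E[e^{tX}]
mgf = (1 - θ * t)^(-k)             # closed form from [1]
isapprox(mc, mgf; rtol = 0.01)     # holds up to sampling error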

Mean and Variance

  • [2]: If $X \sim \Gamma ( k , \theta )$, then $$ \begin{align*} E(X) =& k \theta \\ \text{Var} (X) =& k \theta^{2} \end{align*} $$
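
These identities can be confirmed directly with Distributions.jl (arbitrary parameter values):

using Distributions, Statistics

k, θ = 4.0, 2.0
d = Gamma(k, θ)

mean(d) ≈ k * θ        # true
var(d)  ≈ k * θ^2      # true

X = rand(d, 10^6)      # the same identities, from a large sample
mean(X), var(X)        # ≈ (8.0, 16.0)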

Sufficient Statistic

  • [3]: Let’s say we have a random sample $\mathbf{X} := \left( X_{1} , \cdots , X_{n} \right) \sim \Gamma \left( k, \theta \right)$ that follows a Gamma distribution.

Then the sufficient statistic $T$ for $\left( k, \theta \right)$ is as follows. $$ T = \left( \prod_{i} X_{i}, \sum_{i} X_{i} \right) $$
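
To see why, note that by the Neyman factorization theorem the joint density of the sample depends on the data only through $T$: $$ \prod_{i=1}^{n} f \left( x_{i} \right) = {{ 1 } \over { \Gamma (k)^{n} \theta^{nk} }} \left( \prod_{i} x_{i} \right)^{k - 1} e^{ - \sum_{i} x_{i} / \theta } $$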

Theorems

Scaling

  • [a]: If $X \sim \Gamma ( k , \theta )$, then for scalar $c > 0$, $c X \sim \Gamma ( k , c \theta )$
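
A quick Monte Carlo sanity check of the scaling property (parameter values are arbitrary):

using Distributions, Statistics

k, θ, c = 2.0, 1.0, 3.0
Y = c .* rand(Gamma(k, θ), 10^6)      # sample of cX

target = Gamma(k, c * θ)              # the claimed distribution of cX
[quantile(Y, p) for p in (0.25, 0.5, 0.75)]       # sample quantiles
[quantile(target, p) for p in (0.25, 0.5, 0.75)]  # should roughly match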

Relationship with Poisson Distribution

  • [b]: For all natural numbers $k$, $$ \int_{\mu}^{\infty} { { z^{k-1} e^{-z} } \over { \Gamma (k) } } dz = \sum_{x=0}^{k-1} { { {\mu}^{x} e^{-\mu} } \over {x!} } $$
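
In terms of Distributions.jl, the left side is the upper tail of $\Gamma (k, 1)$ and the right side is the cdf of a Poisson distribution, so the identity can be verified numerically (the values of k and μ below are arbitrary):

using Distributions

k, μ = 5, 2.3

lhs = ccdf(Gamma(k, 1), μ)     # ∫_μ^∞ z^(k-1) e^(-z) / Γ(k) dz
rhs = cdf(Poisson(μ), k - 1)   # Σ_{x=0}^{k-1} μ^x e^(-μ) / x!
lhs ≈ rhs                      # true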

Relationship with Exponential Distribution

  • [c]: $$\Gamma \left(1, { 1 \over \lambda } \right) \iff \text{exp} (\lambda)$$
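
Note that Distributions.jl parameterizes Exponential by its scale (mean) $1 / \lambda$, so the equivalence reads as follows (λ and the grid are arbitrary):

using Distributions

λ = 0.7
x = 0.5:0.5:5.0
all(pdf.(Gamma(1, 1/λ), x) .≈ pdf.(Exponential(1/λ), x))   # true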

Relationship with Chi-squared Distribution

  • [d]: $$\Gamma \left( { r \over 2 } , 2 \right) \iff \chi ^2 (r)$$
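
The same kind of check works here (r and the grid are arbitrary):

using Distributions

r = 5
x = 0.5:0.5:10.0
all(pdf.(Gamma(r/2, 2), x) .≈ pdf.(Chisq(r), x))   # true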

Derivation of Beta Distribution

  • [e]: If two random variables $X_{1},X_{2}$ are independent and $X_{1} \sim \Gamma ( \alpha_{1} , 1)$, $X_{2} \sim \Gamma ( \alpha_{2} , 1)$, then $$ {{ X_{1} } \over { X_{1} + X_{2} }} \sim \text{beta} \left( \alpha_{1} , \alpha_{2} \right) $$
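
A Monte Carlo sketch of [e], comparing the ratio against $\text{beta} (\alpha_{1}, \alpha_{2})$ (arbitrary parameter values):

using Distributions, Statistics

α₁, α₂ = 2.0, 5.0
X₁ = rand(Gamma(α₁, 1), 10^6)
X₂ = rand(Gamma(α₂, 1), 10^6)
R  = X₁ ./ (X₁ .+ X₂)

mean(R), mean(Beta(α₁, α₂))                      # both ≈ α₁/(α₁+α₂)
quantile(R, 0.9), quantile(Beta(α₁, α₂), 0.9)    # should roughly agree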

Explanation

The Gamma Distribution is named after the Gamma function, and the fact that its probability density function integrates to $1$ comes from the Euler integral. Rather than having an intuitive meaning, it is constructed artificially for its statistically useful properties. Such distributions are also referred to as Sampling Distributions, and the Gamma distribution, with its flexible shape, takes on various forms and provides many convenient properties.

Bayesian

In Bayesian analysis, it is also used as the conjugate prior for the rate parameter of the Poisson distribution, as sketched below.
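
In shape–scale form, observing $n$ iid Poisson counts $x_{1} , \cdots , x_{n}$ updates a $\Gamma (k, \theta)$ prior on the rate to $\Gamma \left( k + \sum_{i} x_{i} , \theta / (n \theta + 1) \right)$. A minimal sketch with made-up counts:

using Distributions

prior = Gamma(2.0, 1.0)       # prior on the Poisson rate λ: shape 2, scale 1
data  = [3, 1, 4, 2, 2]       # hypothetical counts, assumed iid Poisson(λ)
n, s  = length(data), sum(data)

k, θ = params(prior)          # conjugate update: k → k + Σxᵢ, θ → θ/(nθ + 1)
posterior = Gamma(k + s, θ / (n * θ + 1))
mean(posterior)               # posterior mean of λ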

Proofs

[1]

When $\displaystyle t < {{ 1 } \over { \theta }}$, let $\displaystyle y := x {{ ( 1 - \theta t ) } \over { \theta }}$ so that $\displaystyle dy = {{ ( 1 - \theta t ) } \over { \theta }} dx$. Then $$ \begin{align*} m(t) =& \int_{0}^{\infty} e^{tx} {{ 1 } \over { \Gamma ( k ) \theta^{k} }} x^{k - 1} e^{ - x / \theta} dx \\ =& \int_{0}^{\infty} {{ 1 } \over { \Gamma ( k ) \theta^{k} }} x^{k - 1} e^{ x (t - 1 / \theta) } dx \\ =& \int_{0}^{\infty} {{ 1 } \over { \Gamma ( k ) \theta^{k} }} x^{k - 1} e^{ - x {{( 1 - \theta t)} \over {\theta}} } dx \\ =& \int_{0}^{\infty} {{ 1 } \over { \Gamma ( k ) \theta^{k} }} \left( {{ y \theta } \over { 1 - \theta t }} \right)^{k - 1} e^{ - y } {{ \theta } \over { 1 - \theta t }}dy \\ =& \left( {{ 1 } \over { 1 - \theta t }} \right)^{k } \int_{0}^{\infty} {{ \theta^{k} } \over { \Gamma ( k ) \theta^{k} }} y^{k-1} e^{ - y } dy \end{align*} $$ By the Euler integral, $\displaystyle \int_{0}^{\infty} {{ 1 } \over { \Gamma ( k ) }} y^{k-1} e^{ - y } dy = 1$, so $$ m(t) = \left( 1 - \theta t\right)^{-k} \qquad , t < {{ 1 } \over { \theta }} $$

[2]

Direct deduction.

[3]

Direct deduction.

[a]

Let $X \sim \Gamma ( k , \theta )$, $c > 0$, and $Y = c X$. Then $$ \begin{align*} m_{X}(t) =& \int_{0}^{\infty} e^{tx} {{ 1 } \over { \Gamma ( k ) \theta^{k} }} x^{k - 1} e^{ - x / \theta} dx \\ =& \int_{0}^{\infty} e^{tx} {{ c^{k} } \over { \Gamma ( k ) (c\theta)^{k} }} x^{k - 1} e^{ - cx / c\theta} dx \\ =& \int_{0}^{\infty} e^{{{ t } \over { c }} cx} {{ 1 } \over { \Gamma ( k ) (c\theta)^{k} }} (cx)^{k - 1} e^{ - cx / c\theta} c dx \\ =& \int_{0}^{\infty} e^{{{ t } \over { c }} y} {{ 1 } \over { \Gamma ( k ) (c\theta)^{k} }} y^{k - 1} e^{ - y / c\theta} dy \end{align*} $$ According to the moment generating function [1], $$ \begin{align*} m_{Y}(t) =& E \left( e^{tY} \right) \\ =& E \left( e^{tcX} \right) \\ =& \int_{0}^{\infty} e^{{{ tc } \over { c }} y} {{ 1 } \over { \Gamma ( k ) (c\theta)^{k} }} y^{k - 1} e^{ - y / c\theta} dy \\ =& \int_{0}^{\infty} e^{tz} {{ 1 } \over { \Gamma ( k ) (c\theta)^{k} }} z^{k - 1} e^{ - z / c\theta} dz \\ =& (1 - c \theta t)^{-k} \end{align*} $$ Therefore, $Y \sim \Gamma ( k , c \theta)$.

[b]

Shown by mathematical induction.
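
The key induction step is one integration by parts: $$ \int_{\mu}^{\infty} {{ z^{k} e^{-z} } \over { \Gamma (k+1) }} dz = \left[ - {{ z^{k} e^{-z} } \over { k! }} \right]_{\mu}^{\infty} + \int_{\mu}^{\infty} {{ z^{k-1} e^{-z} } \over { \Gamma (k) }} dz = {{ \mu^{k} e^{-\mu} } \over { k! }} + \int_{\mu}^{\infty} {{ z^{k-1} e^{-z} } \over { \Gamma (k) }} dz $$ so passing from $k$ to $k+1$ adds exactly the term $\mu^{k} e^{-\mu} / k!$ to the sum.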

[c]

Shown by the moment generating function.
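
Concretely, substituting $k = 1$ and $\theta = 1/\lambda$ into [1] gives $$ m(t) = \left( 1 - {{ t } \over { \lambda }} \right)^{-1} = {{ \lambda } \over { \lambda - t }} \qquad , t < \lambda $$ which is the moment generating function of $\text{exp} (\lambda)$.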

[d]

Shown by the moment generating function.
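
Likewise, substituting $k = r/2$ and $\theta = 2$ into [1] gives $$ m(t) = \left( 1 - 2 t \right)^{-r/2} \qquad , t < {{ 1 } \over { 2 }} $$ which is the moment generating function of $\chi^{2} (r)$.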

Code

Here is Julia code that draws the probability density function of the Gamma distribution as animated GIFs.

@time using LaTeXStrings
@time using Distributions
@time using Plots

cd(@__DIR__)

x = 0:0.1:20
Θ = collect(0.1:0.1:10.0); append!(Θ, reverse(Θ))

animation = @animate for θ ∈ Θ
    plot(x, pdf.(Gamma(1, θ), x),
     color = :black,
     label = "r = 1, θ = $(rpad(θ, 4, '0'))", size = (400,300))
    xlims!(0,20); ylims!(0,0.5); title!(L"\mathrm{pmf\,of\,} \Gamma (1, \theta)")
end
gif(animation, "pdf1.gif")

animation = @animate for θ ∈ Θ
    plot(x, pdf.(Gamma(2, θ), x),
     color = :black,
     label = "r = 2, θ = $(rpad(θ, 4, '0'))", size = (400,300))
    xlims!(0,20); ylims!(0,0.5); title!(L"\mathrm{pmf\,of\,} \Gamma (2, \theta)")
end
gif(animation, "pdf2.gif")

animation = @animate for θ ∈ Θ
    plot(x, pdf.(Gamma(4, θ), x),
     color = :black,
     label = "r = 4, θ = $(rpad(θ, 4, '0'))", size = (400,300))
    xlims!(0,20); ylims!(0,0.5); title!(L"\mathrm{pmf\,of\,} \Gamma (4, \theta)")
end
gif(animation, "pdf4.gif")

  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p158. ↩︎