
Chi-Squared Distribution

Definition 1

The chi-square distribution $\chi^{2} (r)$ with degrees of freedom $r > 0$ is the continuous probability distribution with the following probability density function. $$ f(x) = {{ 1 } \over { \Gamma (r/2) 2^{r/2} }} x^{r/2-1} e^{-x/2} \qquad , x \in (0, \infty) $$
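As a quick numerical check (an addition, not part of the original text), the density above can be compared with SciPy's built-in implementation; this is a minimal sketch assuming NumPy and SciPy are available.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma
from scipy.stats import chi2

def chi2_pdf(x, r):
    """The density exactly as written in the definition above."""
    return x ** (r / 2 - 1) * np.exp(-x / 2) / (gamma(r / 2) * 2 ** (r / 2))

r = 5
x = np.linspace(0.1, 20, 200)
assert np.allclose(chi2_pdf(x, r), chi2.pdf(x, df=r))  # agrees with scipy.stats.chi2
total, _ = quad(chi2_pdf, 0, np.inf, args=(r,))
assert np.isclose(total, 1.0)                          # integrates to 1 over (0, infinity)
```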


Basic Properties

Moment Generating Function

  • [1]: $$m(t) = (1-2t)^{-r/2} \qquad , t < {{ 1 } \over { 2 }}$$
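For a concrete check of [1] (an added sketch, not from the source), the integral $\int_{0}^{\infty} e^{tx} f(x) dx$ can be evaluated numerically for a fixed $r$ and $t < 1/2$ and compared with the closed form; assumes SciPy.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import chi2

r, t = 4, 0.3  # any degrees of freedom r > 0 and any t < 1/2
mgf_numeric, _ = quad(lambda x: np.exp(t * x) * chi2.pdf(x, df=r), 0, np.inf)
assert np.isclose(mgf_numeric, (1 - 2 * t) ** (-r / 2))  # matches (1 - 2t)^{-r/2}
```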

Mean and Variance

  • [2]: If $X \sim \chi^{2} (r)$, then the mean and variance are $$ \begin{align*} E(X) =& r \\ \operatorname{Var} (X) =& 2r \end{align*} $$
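A short SciPy/Monte Carlo check of [2] (added here for illustration):

```python
import numpy as np
from scipy.stats import chi2

r = 7
mean, var = chi2.stats(df=r, moments="mv")
assert mean == r and var == 2 * r                 # E(X) = r, Var(X) = 2r

samples = chi2.rvs(df=r, size=200_000, random_state=0)
print(samples.mean(), samples.var())              # roughly 7 and 14
```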

Sufficient Statistics

  • [3]: Suppose a random sample $\mathbf{X} := \left( X_{1} , \cdots , X_{n} \right)$ from the chi-square distribution $\chi^{2} (r)$ is given. Then the sufficient statistic $T$ for $r$ is as follows. $$ T = \left( \prod_{i} X_{i} \right) $$
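The factorization behind [3] can be illustrated numerically (this example is an addition, not in the original): the likelihood ratio between any two candidate values of $r$ depends on the sample only through $\prod_{i} X_{i}$, so two samples with the same product give the same ratio.

```python
import numpy as np
from scipy.stats import chi2

def log_likelihood(x, r):
    return chi2.logpdf(x, df=r).sum()

rng = np.random.default_rng(0)
x = rng.chisquare(df=3, size=5)
y = x.copy()
y[0], y[1] = y[0] * 2.0, y[1] / 2.0        # a different sample with the same product
assert np.isclose(np.prod(x), np.prod(y))

r1, r2 = 3, 6
# The log-likelihood ratio agrees for x and y because their products agree.
assert np.isclose(log_likelihood(x, r1) - log_likelihood(x, r2),
                  log_likelihood(y, r1) - log_likelihood(y, r2))
```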

Theorems

$k$th Moment

  • [a]: Let $X \sim \chi^{2} (r)$. If $k > - r/ 2$, then the $k$th moment exists and $$ E X^{k} = {{ 2^{k} \Gamma (r/2 + k) } \over { \Gamma (r/2) }} $$
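A numerical spot check of [a], including a fractional $k$ in the allowed range $k > -r/2$ (added for illustration; assumes SciPy):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma
from scipy.stats import chi2

r, k = 5, -1.5                      # any k > -r/2 = -2.5 is admissible
moment_numeric, _ = quad(lambda x: x ** k * chi2.pdf(x, df=r), 0, np.inf)
moment_closed = 2 ** k * gamma(r / 2 + k) / gamma(r / 2)
assert np.isclose(moment_numeric, moment_closed)
```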

Relationship with the Gamma Distribution

  • [b]: $$\Gamma \left( { r \over 2 } , 2 \right) \iff \chi ^2 (r)$$
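Here $\Gamma \left( r/2, 2 \right)$ denotes the gamma distribution with shape $r/2$ and scale $2$; the agreement of the two densities can be checked directly (an added sketch using SciPy's shape/scale parametrization):

```python
import numpy as np
from scipy.stats import chi2, gamma

r = 6
x = np.linspace(0.01, 30, 300)
# Gamma(shape = r/2, scale = 2) has exactly the chi^2(r) density.
assert np.allclose(chi2.pdf(x, df=r), gamma.pdf(x, a=r / 2, scale=2))
```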

Derivation of the F-Distribution

  • [c]: If two random variables $U,V$ are independent and $U \sim \chi^{2} ( r_{1})$, $V \sim \chi^{2} ( r_{2})$, then $$ {{ U / r_{1} } \over { V / r_{2} }} \sim F \left( r_{1} , r_{2} \right) $$
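A Monte Carlo illustration of [c] (not in the original text): simulate the scaled ratio and compare it with $F(r_{1}, r_{2})$ via a Kolmogorov–Smirnov test.

```python
import numpy as np
from scipy.stats import chi2, f, kstest

rng = np.random.default_rng(0)
r1, r2, n = 4, 9, 100_000
u = chi2.rvs(df=r1, size=n, random_state=rng)
v = chi2.rvs(df=r2, size=n, random_state=rng)
ratio = (u / r1) / (v / r2)
# A large p-value: the simulated ratios are consistent with F(r1, r2).
print(kstest(ratio, f(dfn=r1, dfd=r2).cdf).pvalue)
```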

Relationship with the Square of a Standard Normal Distribution

  • [d]: If $X \sim N(\mu,\sigma ^2)$, then $$ V=\left( { X - \mu \over \sigma} \right) ^2 \sim \chi ^2 (1) $$
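And a Monte Carlo check of [d] itself (an added sketch): standardize, square, and compare against $\chi^{2} (1)$.

```python
import numpy as np
from scipy.stats import chi2, kstest, norm

mu, sigma = 3.0, 2.0
rng = np.random.default_rng(1)
x = norm.rvs(loc=mu, scale=sigma, size=100_000, random_state=rng)
v = ((x - mu) / sigma) ** 2
print(kstest(v, chi2(df=1).cdf).pvalue)   # typically large: consistent with chi^2(1)
```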

Explanation

The chi-square distribution is used widely throughout statistics; it is typically first encountered in goodness-of-fit tests or analysis of variance.

Theorem [d] is particularly important: by its contrapositive, if the squared standardized residuals do not follow the chi-square distribution $\chi^{2} (1)$, then the normality assumption on the residuals is in doubt. A sketch of such a check follows.
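The helper below is hypothetical (not from the source): standardize the residuals, square them, and test against $\chi^{2} (1)$; normal residuals typically pass, while heavy-tailed residuals are flagged.

```python
import numpy as np
from scipy.stats import chi2, kstest

def squared_residual_pvalue(residuals):
    """Hypothetical helper: KS test of squared standardized residuals vs chi^2(1)."""
    z2 = ((residuals - residuals.mean()) / residuals.std(ddof=1)) ** 2
    return kstest(z2, chi2(df=1).cdf).pvalue

rng = np.random.default_rng(2)
print(squared_residual_pvalue(rng.normal(size=5_000)))            # typically large
print(squared_residual_pvalue(rng.standard_t(df=2, size=5_000)))  # essentially zero
```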

Proof

Strategy for [1] and [a]: pull the constants outside the integral and turn the remaining integral into a gamma function by substitution.

Definition of the Gamma Function: $$ \Gamma (x) := \int_{0}^{\infty} y^{x-1} e^{-y} dy $$

[1]

Substituting $y = x (1/2 - t)$, so that $dx = {{ 1 } \over { 1/2 - t }} dy$ (valid because $t < 1/2$), $$ \begin{align*} m(t) =& \int_{0}^{\infty} e^{tx} {{ 1 } \over { \Gamma (r/2) 2^{r/2} }} x^{r/2-1} e^{-x/2} dx \\ =& {{ 1 } \over { \Gamma (r/2) 2^{r/2} }} \int_{0}^{\infty} x^{r/2-1} e^{-x(1/2-t)} dx \\ =& {{ 1 } \over { \Gamma (r/2) 2^{r/2} }} \int_{0}^{\infty} \left( {{ y } \over { 1/2 -t }} \right)^{r/2-1} e^{-y} {{ 1 } \over { 1/2 - t }} dy \\ =& (1/2-t)^{-r/2}{{ 1 } \over { \Gamma (r/2) 2^{r/2} }} \int_{0}^{\infty} y^{r/2-1} e^{-y} dy \\ =& (1-2t)^{-r/2}{{ 1 } \over { \Gamma (r/2) }} \int_{0}^{\infty} y^{r/2-1} e^{-y} dy \end{align*} $$ so by the definition of the gamma function $$ m(t) = (1-2t)^{-r/2} \qquad , t < {{ 1 } \over { 2 }} $$

[2]

Substitute $k = 1$ and $k = 2$ into the moment formula [a]: $$ E(X) = {{ 2 \Gamma (r/2 + 1) } \over { \Gamma (r/2) }} = r \qquad , \qquad E \left( X^{2} \right) = {{ 4 \Gamma (r/2 + 2) } \over { \Gamma (r/2) }} = r(r+2) $$ so that $\operatorname{Var} (X) = E \left( X^{2} \right) - \left( E(X) \right)^{2} = 2r$.

[a]

Substituting $y = x/2$, so that $dx = 2 dy$, $$ \begin{align*} EX^{k} =& \int_{0}^{\infty} x^{k} {{ 1 } \over { \Gamma (r/2) 2^{r/2} }} x^{r/2-1} e^{-x/2} dx \\ =& {{ 1 } \over { \Gamma (r/2) 2^{r/2} }} \int_{0}^{\infty} x^{r/2+k-1} e^{-x/2} dx \\ =& {{ 1 } \over { \Gamma (r/2) 2^{r/2} }} \int_{0}^{\infty} 2^{r/2+k-1} y^{r/2+k-1} e^{-y} 2dy \\ =& {{ 2^{k} } \over { \Gamma (r/2) }} \int_{0}^{\infty} y^{(r/2+k)-1} e^{-y} dy \end{align*} $$ so by the definition of the gamma function $$ E X^{k} = {{ 2^{k} \Gamma (r/2 + k) } \over { \Gamma (r/2) }} $$

[b]

Shown through the moment-generating function.

[c]

Deduced directly from the joint density function.

[d]

Deduced directly from the probability density function.

See Also

Generalization: Non-central Chi-square Distribution


  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p161. ↩︎