
The Square of a Standard Normal Distribution Follows a Chi-Square Distribution with One Degree of Freedom

Theorem

If $X \sim N(\mu,\sigma ^2)$ then $$ V=\left( { X - \mu \over \sigma} \right) ^2 \sim \chi ^2 (1) $$
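The theorem is easy to check empirically. The following is a minimal Monte Carlo sketch in Python (the parameters $\mu = 3$, $\sigma = 2$ are arbitrary choices for the demonstration): the sample mean and variance of $V$ should be close to $1$ and $2$, the mean and variance of $\chi^2 (1)$.

```python
import math
import random

random.seed(42)

mu, sigma = 3.0, 2.0   # arbitrary normal parameters for the demo
n = 200_000

# V = ((X - mu) / sigma)^2 for X ~ N(mu, sigma^2)
v = [((random.gauss(mu, sigma) - mu) / sigma) ** 2 for _ in range(n)]

mean_v = sum(v) / n
var_v = sum((x - mean_v) ** 2 for x in v) / n

# chi^2(1) has mean 1 and variance 2
print(f"sample mean     = {mean_v:.3f}  (theory: 1)")
print(f"sample variance = {var_v:.3f}  (theory: 2)")
```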


Description

Student’s theorem is the standard generalization of this result.

Anyone studying statistics should know, as second nature, that the square of a standard normal variable follows a chi-squared distribution. When data can be assumed to follow a normal distribution, an excessively high or low variance in the standardized data immediately signals a problem. Naturally, this fact underlies many statistical tests, and having theoretical intuition for it makes a world of difference.

Conversely, rather than starting from the definition of the chi-squared distribution and deriving its properties, it is often more natural to ask what distribution the square of standard normal data, such as squared residuals, follows.

Proof 1

If we set $\displaystyle W := {(X-\mu) \over \sigma }$ then we get $W \sim N(0,1)$.

Definition of standard normal distribution: A normal distribution $N \left( 0,1^{2} \right)$ with the following probability density function is called a standard normal distribution. $$ f(z) = {{ 1 } \over { \sqrt{2 \pi} }} \exp \left[ - {{ z^{2} } \over { 2 }} \right] $$

If the cumulative distribution function of $V$ is denoted by $F$, then $$ \begin{align*} F(v) =& P(V \le v) \\ =& P \left( W^2 \le v \right) \\ =& P \left( -\sqrt{v} \le W \le \sqrt{v} \right) \\ =& \int_{-\sqrt{v}}^{\sqrt{v}} { 1 \over \sqrt{ 2 \pi } } e^{-{{w^2} \over 2}} dw \\ =& 2 \int_{0}^{\sqrt{v}} { 1 \over \sqrt{ 2 \pi } } e^{-{{w^2} \over 2}} dw \end{align*} $$ Substituting $w := \sqrt{x}$ yields $$ F(v) = 2\int_{0}^{v} { 1 \over \sqrt{ 2 \pi } } e^{-{{x} \over 2}} {1 \over {2 \sqrt{x} } } dx $$ By the fundamental theorem of calculus, the probability density function $f$ of $V$ is $$ f(v) = F ' (v) = { 1 \over {\sqrt{ 2 \pi } } } e^{-{{v} \over 2}}{ 1 \over {v^{1 \over 2}} } $$
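The derivation above can be sanity-checked numerically. Since $\displaystyle 2 \int_{0}^{\sqrt{v}} { 1 \over \sqrt{ 2 \pi } } e^{-{{w^2} \over 2}} dw = \operatorname{erf} \left( \sqrt{v/2} \right)$, the closed form of $F$ is available through the standard library, and a central difference of $F$ should match the density $f$. A small Python sketch using only `math`:

```python
import math

# Closed form of the CDF derived above:
# F(v) = 2 * Integral_0^sqrt(v) (1/sqrt(2*pi)) e^{-w^2/2} dw = erf(sqrt(v/2))
def F(v):
    return math.erf(math.sqrt(v / 2))

# Density obtained by differentiating F
def f(v):
    return math.exp(-v / 2) / (math.sqrt(2 * math.pi) * math.sqrt(v))

# F'(v) should agree with f(v); compare via a central difference
h = 1e-6
checks = []
for v in (0.5, 1.0, 2.0, 5.0):
    numeric = (F(v + h) - F(v - h)) / (2 * h)
    checks.append((numeric, f(v)))
    print(f"v = {v}: F'(v) = {numeric:.6f}, f(v) = {f(v):.6f}")
```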

Euler’s reflection formula: $$ {\Gamma (1-x) \Gamma ( x )} = { {\pi} \over {\sin \pi x } } $$

Applying the reflection formula at $\displaystyle x = {{ 1 } \over { 2 }}$ gives $\displaystyle \Gamma \left( {{ 1 } \over { 2 }} \right) = \sqrt{\pi}$, so $$ f(v) = { 1 \over { \Gamma ({1 \over 2}) 2^{1 \over 2} } } v^{ - {1 \over 2} } e^{-{{v} \over 2}} $$

Definition of gamma distribution: A continuous probability distribution $\Gamma ( k , \theta )$ with the following probability density function for $k, \theta > 0$ is called a gamma distribution. $$ f(x) = {{ 1 } \over { \Gamma ( k ) \theta^{k} }} x^{k - 1} e^{ - x / \theta} \qquad , x > 0 $$

In conclusion, this is exactly the probability density function of the gamma distribution $\displaystyle \Gamma \left( {{ 1 } \over { 2 }} , 2 \right)$, so $\displaystyle V \sim \Gamma \left( {{ 1 } \over { 2 }} , 2 \right)$.
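That the density above is exactly the $\displaystyle \Gamma \left( {{ 1 } \over { 2 }} , 2 \right)$ density can be confirmed directly, since `math.gamma` evaluates $\displaystyle \Gamma \left( {{ 1 } \over { 2 }} \right) = \sqrt{\pi}$. A small Python check, not part of the proof:

```python
import math

# Gamma(k, theta) density from the definition above
def gamma_pdf(x, k, theta):
    return x ** (k - 1) * math.exp(-x / theta) / (math.gamma(k) * theta ** k)

# Density of V derived in the proof
def f(v):
    return math.exp(-v / 2) / (math.sqrt(2 * math.pi) * math.sqrt(v))

print(math.isclose(math.gamma(0.5), math.sqrt(math.pi)))   # Gamma(1/2) = sqrt(pi) -> True
for v in (0.3, 1.0, 4.0):
    print(math.isclose(f(v), gamma_pdf(v, 0.5, 2.0)))      # -> True
```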

Relationship between gamma and chi-squared distributions: $$ \Gamma \left( { r \over 2 } , 2 \right) \iff \chi ^2 (r) $$

Therefore, $\displaystyle \Gamma \left( {1 \over 2}, 2 \right) \iff \chi^2 (1)$, and $$ \left( { X - \mu \over \sigma} \right) ^2 \sim \chi ^2 (1) $$


  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): 175-176. ↩︎