

The Square of a Standard Normal Distribution Follows a Chi-Square Distribution with One Degree of Freedom

Theorem

If $X \sim N(\mu, \sigma^2)$, then
$$ V = \left( \frac{X - \mu}{\sigma} \right)^2 \sim \chi^2(1) $$


  • $N\left( \mu, \sigma^2 \right)$ is a normal distribution with mean $\mu$ and variance $\sigma^2$.
  • $\chi^2(1)$ is a chi-squared distribution with $1$ degree of freedom.
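As a quick sanity check, one can verify the theorem by simulation. The sketch below uses only the Python standard library; the parameters $\mu = 3$, $\sigma = 2$ and the sample size are arbitrary choices. It standardizes normal samples, squares them, and compares the sample moments with those of $\chi^2(1)$, whose mean is $1$ and variance is $2$.

```python
import random
import statistics

# Monte Carlo sketch: standardize samples from N(mu, sigma^2), square them,
# and compare the sample moments with those of chi^2(1) (mean 1, variance 2).
random.seed(42)
mu, sigma, n = 3.0, 2.0, 200_000

v = [((random.gauss(mu, sigma) - mu) / sigma) ** 2 for _ in range(n)]

mean_v = statistics.fmean(v)
var_v = statistics.variance(v)
print(f"sample mean     = {mean_v:.3f}  (chi^2(1) mean     = 1)")
print(f"sample variance = {var_v:.3f}  (chi^2(1) variance = 2)")
```

With $n = 200{,}000$ samples, the standard error of the sample mean is about $\sqrt{2/n} \approx 0.003$, so both estimates land very close to the theoretical values.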

Description

Student's theorem, which is widely used, generalizes this result.

Anyone studying statistics should know by heart that the square of a standard normal random variable follows a chi-squared distribution. When data can be assumed to follow a normal distribution, an excessively high or low variance of the standardized data immediately suggests a problem. Naturally, this fact is applied in many statistical tests, and having theoretical intuition about it makes all the difference.

Conversely, it is more natural to ask what distribution the square of standard-normal data, for example the square of a residual, follows than to first recall the definition of the chi-squared distribution and then explore its properties.

Proof 1

If we set $\displaystyle W := \frac{X - \mu}{\sigma}$, then $W \sim N(0, 1)$.

Definition of standard normal distribution: A normal distribution $N\left( 0, 1^2 \right)$ with the following probability density function is called a standard normal distribution.
$$ f(z) = \frac{1}{\sqrt{2\pi}} \exp\left[ -\frac{z^2}{2} \right] $$

If the cumulative distribution function of $V$ is denoted by $F$, then
$$ \begin{align*} F(v) =& P(V \le v) \\ =& P\left( W^2 \le v \right) \\ =& P\left( -\sqrt{v} \le W \le \sqrt{v} \right) \\ =& \int_{-\sqrt{v}}^{\sqrt{v}} \frac{1}{\sqrt{2\pi}} e^{-\frac{w^2}{2}} dw \\ =& 2\int_{0}^{\sqrt{v}} \frac{1}{\sqrt{2\pi}} e^{-\frac{w^2}{2}} dw \end{align*} $$
Substituting $w := \sqrt{x}$, so that $\displaystyle dw = \frac{1}{2\sqrt{x}} dx$, yields
$$ F(v) = 2\int_{0}^{v} \frac{1}{\sqrt{2\pi}} e^{-\frac{x}{2}} \frac{1}{2\sqrt{x}} dx $$
By the fundamental theorem of calculus, the probability density function $f$ of $V$ is
$$ f(v) = F'(v) = \frac{1}{\sqrt{2\pi}} e^{-\frac{v}{2}} \frac{1}{v^{1/2}} $$
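The derivation can be double-checked numerically: the derived CDF equals $F(v) = \operatorname{erf}\left(\sqrt{v/2}\right)$, since $P(-a \le W \le a) = \operatorname{erf}(a/\sqrt{2})$, and its finite-difference derivative should match the density just obtained. The test points below are arbitrary.

```python
import math

# Sanity check of the derivation: F(v) = erf(sqrt(v/2)) is the derived CDF,
# and its derivative should match f(v) = e^{-v/2} / (sqrt(2*pi) * sqrt(v)).
def cdf(v: float) -> float:
    return math.erf(math.sqrt(v / 2.0))

def pdf(v: float) -> float:
    return math.exp(-v / 2.0) / (math.sqrt(2.0 * math.pi) * math.sqrt(v))

h = 1e-6
for v in (0.1, 0.5, 1.0, 2.0, 5.0):
    numeric = (cdf(v + h) - cdf(v - h)) / (2.0 * h)  # central difference
    assert abs(numeric - pdf(v)) < 1e-6, (v, numeric, pdf(v))
print("finite-difference derivative of F matches f(v) at all test points")
```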

Euler’s reflection formula:
$$ \Gamma(1 - x)\Gamma(x) = \frac{\pi}{\sin \pi x} $$

Setting $x = \frac{1}{2}$ in the reflection formula gives $\Gamma\left(\frac{1}{2}\right)^2 = \pi$, i.e. $\displaystyle \sqrt{\pi} = \Gamma\left(\frac{1}{2}\right)$, so
$$ f(v) = \frac{1}{\Gamma\left(\frac{1}{2}\right) 2^{1/2}} v^{-\frac{1}{2}} e^{-\frac{v}{2}} $$
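Both ingredients of this step can be confirmed with `math.gamma`; the evaluation points are arbitrary.

```python
import math

# Check Euler's reflection formula Gamma(1-x)*Gamma(x) = pi / sin(pi*x)
# at a few points, including x = 1/2, which yields Gamma(1/2) = sqrt(pi).
for x in (0.25, 0.5, 0.75):
    lhs = math.gamma(1.0 - x) * math.gamma(x)
    rhs = math.pi / math.sin(math.pi * x)
    assert abs(lhs - rhs) < 1e-12 * rhs, (x, lhs, rhs)

assert abs(math.gamma(0.5) - math.sqrt(math.pi)) < 1e-12
print("reflection formula and Gamma(1/2) = sqrt(pi) verified")
```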

Definition of gamma distribution: A continuous probability distribution $\Gamma(k, \theta)$ with the following probability density function for $k, \theta > 0$ is called a gamma distribution.
$$ f(x) = \frac{1}{\Gamma(k) \theta^k} x^{k-1} e^{-x/\theta} \qquad , x > 0 $$

In conclusion, $V$ has the probability density function of the gamma distribution $\displaystyle \Gamma\left( \frac{1}{2}, 2 \right)$.

Relationship between gamma and chi-squared distributions:
$$ \Gamma\left( \frac{r}{2}, 2 \right) \iff \chi^2(r) $$

Therefore $\displaystyle V \sim \Gamma\left( \frac{1}{2}, 2 \right) = \chi^2(1)$, that is,
$$ \left( \frac{X - \mu}{\sigma} \right)^2 \sim \chi^2(1) $$
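Finally, plugging $k = \frac{1}{2}$, $\theta = 2$ into the gamma density from the definition above reproduces the density derived for $V$; the check points are arbitrary.

```python
import math

# The gamma density with k = 1/2, theta = 2 should equal the derived
# density f(v) = v^{-1/2} e^{-v/2} / (Gamma(1/2) * 2^{1/2}).
def gamma_pdf(x: float, k: float, theta: float) -> float:
    return x ** (k - 1.0) * math.exp(-x / theta) / (math.gamma(k) * theta ** k)

def chi2_1_pdf(v: float) -> float:
    return v ** -0.5 * math.exp(-v / 2.0) / (math.gamma(0.5) * 2.0 ** 0.5)

for v in (0.2, 1.0, 3.0, 7.5):
    assert abs(gamma_pdf(v, 0.5, 2.0) - chi2_1_pdf(v)) < 1e-12, v
print("Gamma(1/2, 2) density matches the chi^2(1) density")
```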


  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): 175–176. ↩︎