Derivation of the Student's t-Distribution from Independent Normal Distributions and the Chi-Squared Distribution
Theorem
Let $W, V$ be independent random variables with $W \sim N(0,1)$ and $V \sim \chi^{2} (r)$. Then $$ T = { {W} \over {\sqrt{V/r} } } \sim t(r) $$
- $N \left( \mu , \sigma^{2} \right)$ is a normal distribution with mean $\mu$ and variance $\sigma^{2}$.
- $\chi^{2} \left( r \right)$ is a chi-squared distribution with degrees of freedom $r$.
- $t(r)$ is a t-distribution with degrees of freedom $r$.
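Before the formal derivation, the theorem can be sanity-checked by simulation. The sketch below (not part of the original derivation; the choices $r = 10$ and $n = 100{,}000$ are illustrative) draws $W \sim N(0,1)$, builds $V \sim \chi^{2}(r)$ as a sum of $r$ squared standard normals, forms $T = W/\sqrt{V/r}$, and compares the sample moments against the known $t(r)$ moments: mean $0$ and variance $r/(r-2)$ for $r > 2$.

```python
import math
import random

random.seed(0)
r = 10        # degrees of freedom (illustrative choice; r > 2 so the variance exists)
n = 100_000   # number of simulated draws

samples = []
for _ in range(n):
    w = random.gauss(0.0, 1.0)                              # W ~ N(0, 1)
    v = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(r))  # V ~ chi^2(r), sum of r squared normals
    samples.append(w / math.sqrt(v / r))                    # T = W / sqrt(V/r)

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / (n - 1)

# For t(10): mean 0, variance 10/8 = 1.25
print(f"sample mean = {mean:.4f}, sample variance = {var:.4f}")
```

The sample moments should land close to $0$ and $1.25$ up to Monte Carlo noise; this is evidence, not proof, which the derivation below supplies.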
Description
Approached purely through statistics, this theorem is both more practical and historically closer to how the t-distribution was originally defined.
Derivation¹
Strategy: Direct deduction through the joint probability density function.
Definition of normal distribution: A continuous probability distribution $N \left( \mu,\sigma^{2} \right)$ with the following probability density function for $\mu \in \mathbb{R}$ and $\sigma > 0$ is called a normal distribution. $$ f(x) = {{ 1 } \over { \sqrt{2 \pi} \sigma }} \exp \left[ - {{ 1 } \over { 2 }} \left( {{ x - \mu } \over { \sigma }} \right)^{2} \right] \qquad, x \in \mathbb{R} $$
Definition of chi-squared distribution: A continuous probability distribution $\chi^{2} (r)$ with the following probability density function for degrees of freedom $r > 0$ is called a chi-squared distribution. $$ f(x) = {{ 1 } \over { \Gamma (r/2) 2^{r/2} }} x^{r/2-1} e^{-x/2} \qquad , x \in (0, \infty) $$
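As a quick numerical check that these two stated densities are correctly normalized, the sketch below (stdlib only; the cutoffs, grid size, and the choice $r = 4$ are illustrative assumptions) integrates each with a plain trapezoidal rule and confirms the result is $1$.

```python
import math

def f1(w):
    # standard normal density N(0, 1)
    return math.exp(-w * w / 2) / math.sqrt(2 * math.pi)

def f2(v, r):
    # chi-squared density with r degrees of freedom
    return v ** (r / 2 - 1) * math.exp(-v / 2) / (math.gamma(r / 2) * 2 ** (r / 2))

def trapezoid(f, a, b, n):
    # composite trapezoidal rule on [a, b] with n subintervals
    h = (b - a) / n
    return h * (sum(f(a + i * h) for i in range(1, n)) + (f(a) + f(b)) / 2)

# Both integrals should be (numerically) 1 over effectively full support
print(trapezoid(f1, -10.0, 10.0, 20_000))
print(trapezoid(lambda v: f2(v, 4), 0.0, 100.0, 20_000))
```

The truncation bounds are safe because both integrands decay exponentially well before the cutoffs.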
Since $W$ and $V$ are independent with probability density functions $$ f_1 (w) := { {1} \over {\sqrt{2 \pi }} } e ^{- w^{2} / 2} \\ f_2 (v) := { 1 \over { \Gamma ({r \over 2}) 2^{r \over 2} } } v^{ {r \over 2} - 1 } e^{-{{v} \over 2}} $$ the joint probability density function $h$ of $W$ and $V$ for $w \in \mathbb{R}$ and $v \in (0,\infty)$ is $$ h(w,v) = { {1} \over {\sqrt{2 \pi }} } e ^{- w^{2} / 2} { 1 \over { \Gamma ({r \over 2}) 2^{r \over 2} } } v^{ {r \over 2} - 1 } e^{-{{v} \over 2}} $$ Now let $\displaystyle T := { {W} \over {\sqrt{V/r} } }$ and $U := V$. The inverse transformation is $w = t\sqrt{u} / \sqrt{r}$ and $v = u$, so $$ \left| J \right| = \begin{vmatrix} {{\sqrt{u}} \over {\sqrt{r}}} & 0 \\ {{t} \over {2 \sqrt{ur}}} & 1 \end{vmatrix} = \sqrt{{{ u } \over { r }}} $$ Thus the joint probability density function of $T, U$ is $$ \begin{align*} g(t,u) =& h \left( {{t \sqrt{u}} \over {\sqrt{r}}} , u \right) |J| \\ =& { {1} \over {\sqrt{2 \pi } \Gamma (r/2) 2^{r/2} } } u^{r/2 -1} \exp \left\{ -{{u} \over {2} } \left( 1 + { {t^2} \over {r} } \right) \right\} { {\sqrt{u} } \over {\sqrt{r} } } \end{align*} $$ Since $U$ is supported on $(0,\infty)$, the marginal probability density function of $T$ is $$ \begin{align*} g(t) =& \int_{0}^{\infty} g(t,u) du \\ =& \int_{0}^{\infty} { {1} \over {\sqrt{2 \pi r} \Gamma (r/2) 2^{r/2} } } u^{(r+1)/2 -1} \exp \left\{ -{{u} \over {2} } \left( 1 + { {t^2} \over {r} } \right) \right\} du \end{align*} $$ Substituting $\displaystyle z := {{u} \over {2}} \left( 1 + {{t^2} \over {r}} \right)$, so that $\displaystyle du = {{2} \over {1 + t^{2}/r}} dz$, gives $$ \begin{align*} g(t) =& \int_{0}^{\infty} { {1} \over {\sqrt{2 \pi r} \Gamma (r/2) 2^{r/2} } } \left( { {2z} \over {1 + t^2 / r} }\right)^{(r+1)/2-1} e^{-z} \left( { {2} \over {1+ t^2 / r} } \right) dz \\ =& { {1} \over {\sqrt{\pi r} \Gamma (r/2) } } \int_{0}^{\infty} {{ 1 } \over { \sqrt{2} \, 2^{r/2} }} z^{(r+1)/2-1} \left( { {2} \over {1 + t^2 / r} }\right)^{(r+1)/2-1} e^{-z} \left( { {2} \over {1+ t^2 / r} } \right) dz \\ =& { {1} \over {\sqrt{\pi r} \Gamma (r/2) } } \int_{0}^{\infty} {{ 1 } \over { 2^{(r+1)/2} }} z^{(r+1)/2-1} \left( { {2} \over {1 + t^2 / r} }\right)^{(r+1)/2} e^{-z} dz \\ =& { {1} \over {\sqrt{\pi r} \Gamma (r/2) } } \int_{0}^{\infty} z^{(r+1)/2-1} \left( { {1} \over {1 + t^2 / r} }\right)^{(r+1)/2} e^{-z} {{ \Gamma \left( (r+1)/2 \right) } \over { \Gamma \left( (r+1)/2 \right) }} dz \\ =& { {\Gamma \left( (r+1)/2 \right)} \over {\sqrt{\pi r} \Gamma (r/2) } } \left( { {1} \over {1 + t^2 / r} }\right)^{(r+1)/2} \int_{0}^{\infty} {{ 1 } \over { \Gamma \left( (r+1)/2 \right) }} z^{(r+1)/2-1} e^{-z} dz \\ =& { {\Gamma \left( (r+1)/2 \right)} \over {\sqrt{\pi r} \Gamma (r/2) } } \left( { {1} \over {1 + t^2 / r} }\right)^{(r+1)/2} \cdot 1 \end{align*} $$ The remaining integrand is the probability density function of the gamma distribution $\Gamma \left( {{ r + 1 } \over { 2 }} , 1 \right)$, so the integral equals $1$ and no further computation is needed. Simplifying, $$ g(t) = {{\Gamma ( (r+1)/2 ) } \over { \sqrt{\pi r} \Gamma (r/2) }} { {1} \over {(1 + t^{2} / r)^{(r+1)/2} } } $$ which is the probability density function of the t-distribution with degrees of freedom $r$. Therefore $$ T = { {W} \over {\sqrt{V/r} } } \sim t(r) $$
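The final formula can be spot-checked numerically: integrating the joint density $g(t,u)$ over $u$ directly should reproduce the closed-form $g(t)$. The sketch below does this with a trapezoidal sum for one illustrative choice ($r = 5$, $t = 1.3$; these values and the grid are assumptions, not part of the derivation).

```python
import math

r, t = 5, 1.3  # illustrative degrees of freedom and evaluation point

def g_closed(t):
    # closed-form t(r) density derived above:
    # Gamma((r+1)/2) / (sqrt(pi r) Gamma(r/2)) * (1 + t^2/r)^{-(r+1)/2}
    return (math.gamma((r + 1) / 2)
            / (math.sqrt(math.pi * r) * math.gamma(r / 2))
            * (1 + t * t / r) ** (-(r + 1) / 2))

def g_joint(t, u):
    # joint density g(t, u) of (T, U) from the derivation
    c = 1 / (math.sqrt(2 * math.pi * r) * math.gamma(r / 2) * 2 ** (r / 2))
    return c * u ** ((r + 1) / 2 - 1) * math.exp(-u / 2 * (1 + t * t / r))

# trapezoidal sum over u in (0, 200]; the integrand vanishes at both ends,
# so the plain sum of interior points is effectively the trapezoidal rule
h = 0.005
numeric = h * sum(g_joint(t, i * h) for i in range(1, 40_000))

print(f"closed form: {g_closed(t):.6f}, numeric marginal: {numeric:.6f}")
```

Agreement between the two numbers confirms that marginalizing the joint density really does yield the stated t-density at this point.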
■
Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): 191-192.