
Central Limit Theorem Proof 📂Mathematical Statistics


Theorem 1

If $\left\{ X_{k} \right\}_{k=1}^{n}$ are iid random variables with mean $\mu$ and variance $\sigma^2$, then as $n \to \infty$ $$ \sqrt{n} {{ \overline{X}_n - \mu } \over {\sigma}} \overset{D}{\to} N (0,1) $$


Explanation

This theorem, along with the Law of Large Numbers, is among the most celebrated results in statistics. Although it is frequently discussed and applied, most people first encounter its proof only when they study mathematical statistics. The proof itself is interesting, which makes the theorem all the more worth knowing.
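Before the proof, the statement can be sanity-checked numerically. Below is a minimal Monte Carlo sketch, using Exponential(1) draws as an illustrative choice (so $\mu = \sigma = 1$); the sample size, number of replicates, and seed are arbitrary.

```python
import random
import statistics

# Illustrative Monte Carlo check (not part of the proof): standardize the
# sample mean of iid Exponential(1) draws (mu = 1, sigma = 1) and see
# that the results behave like draws from N(0, 1).
random.seed(42)

n = 1_000       # sample size per replicate
reps = 2_000    # number of standardized sample means

mu, sigma = 1.0, 1.0
standardized = []
for _ in range(reps):
    xbar = statistics.fmean(random.expovariate(1.0) for _ in range(n))
    standardized.append((n ** 0.5) * (xbar - mu) / sigma)

print(round(statistics.fmean(standardized), 2))   # ≈ 0
print(round(statistics.stdev(standardized), 2))   # ≈ 1
```

The empirical mean and standard deviation of the standardized values approach $0$ and $1$, as $N(0,1)$ predicts.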

Proof

Strategy: Use tricks involving the moment generating function and Taylor’s theorem.


Firstly, assume that the moment generating function $M_{X}(t) = E(e^{t X})$, $-h<t<h$, of $X$ exists, and let $M(t)$ denote the moment generating function of $\displaystyle Y := \sqrt{n} {{ \overline{X}_{n} - \mu } \over { \sigma }}$. By defining a new function $m(t) := E[e^{t(X-\mu)}] = e^{-\mu t} M_{X}(t)$, which also exists for $-h<t<h$, $$ \begin{align*} M(t) =& E \left( e^{ t \sqrt{n} {{ \overline{X}_n - \mu } \over {\sigma}} } \right) \\ =& E \left( e^{ t {{ \sum_{i=1}^{n} X_i - n \mu } \over {\sigma \sqrt{n} }} } \right) \\ =& E \left( e^{ t {{ X_1 - \mu } \over {\sigma \sqrt{n} }} } \right) E \left( e^{ t {{ X_2 - \mu } \over {\sigma \sqrt{n} }} } \right) \cdots E \left( e^{ t {{ X_n - \mu } \over {\sigma \sqrt{n} }} } \right) \\ =& E \left( e^{ t {{ X - \mu } \over {\sigma \sqrt{n} }} } \right) E \left( e^{ t {{ X - \mu } \over {\sigma \sqrt{n} }} } \right) \cdots E \left( e^{ t {{ X - \mu } \over {\sigma \sqrt{n} }} } \right) \\ =& { \left\{ E \left( e^{ t {{ X - \mu } \over {\sigma \sqrt{n} }} } \right) \right\} }^n \\ =& { \left\{ m \left( { {t} \over {\sigma \sqrt{n} } } \right) \right\} } ^{n} \qquad , -h < { {t} \over {\sigma \sqrt{n} } } < h \end{align*} $$ Here the third equality uses independence and the fourth uses identical distribution.
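The factorization above can be illustrated concretely. For $X \sim \text{Exponential}(1)$ (an illustrative choice, with $\mu = \sigma = 1$), $m(t) = E[e^{t(X-\mu)}] = e^{-t}/(1-t)$ in closed form, so a Monte Carlo estimate of $M(t) = E(e^{tY})$ can be compared against $\left\{ m(t/(\sigma\sqrt{n})) \right\}^{n}$:

```python
import math
import random

# Sketch for X ~ Exponential(1), where mu = sigma = 1 and
# m(t) = E[exp(t(X - mu))] = exp(-t) / (1 - t) for t < 1.
# Checks M(t) = { m(t / (sigma sqrt(n))) }^n by Monte Carlo.
random.seed(0)

n, reps, t = 100, 20_000, 0.5

def m(t: float) -> float:
    """Closed-form m(t) for X ~ Exponential(1), valid for t < 1."""
    return math.exp(-t) / (1.0 - t)

# Monte Carlo estimate of M(t) = E[exp(t * sqrt(n) * (xbar - mu) / sigma)].
acc = 0.0
for _ in range(reps):
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    acc += math.exp(t * math.sqrt(n) * (xbar - 1.0))
mc = acc / reps

closed = m(t / math.sqrt(n)) ** n
print(round(mc, 2), round(closed, 2))  # the two values agree closely
```

Both quantities are also already near $e^{t^2/2} \approx 1.13$ at $n = 100$, previewing where the proof is headed.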

Taylor’s theorem: If the function $f(x)$ is continuous on $[a,b]$ and differentiable up to order $n$ on $(a,b)$, then for $x_{0} \in (a,b)$, there exists $\xi \in (a,b)$ such that $\displaystyle f(x) = \sum_{k=0}^{n-1} {{( x - x_{0} )^{k}\over{ k! }}{f^{(k)}( x_{0} )}} + {(x - x_{0} )^{n}\over{ n! }}{f^{(n)}(\xi)}$ is satisfied.

Applying Taylor’s theorem with $n=2$ about $x_{0} = 0$ shows that there exists $\xi$ between $0$ and $t$, that is, in $(-t,0)$ or $(0,t)$ depending on the sign of $t$. Hence, $m(t)$ can be expressed as $$ m(t) = m(0) + m ' (0)t + { {m '' (\xi) t^2} \over {2} } $$ Meanwhile, $$ \begin{cases} m(0)=1 \\ m ' (0) = E(X-\mu) = 0 \\ m '' (0) = E[(X-\mu)^2] = {\sigma}^2 \end{cases} $$ thus $\displaystyle m(t) = 1 + { {m '' (\xi) t^2} \over {2} }$. Here comes the trick: by adding and then subtracting $\displaystyle {{\sigma^2 t^2} \over {2}}$ on the right side, $$ m(t) = 1 + { { \sigma^2 t^2} \over {2} } + { { [ m '' (\xi) - \sigma^2 ] t^2} \over {2} } $$ In other words, $$ M(t) = { \left\{ m \left( { {t} \over {\sigma \sqrt{n} } } \right) \right\} } ^{n} = { \left\{ 1 + { { t^2} \over {2n} } + { { [ m '' (\xi) - \sigma^2 ] t^2} \over {2n \sigma^2 } } \right\} } ^{n} $$
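The Taylor data $m(0) = 1$, $m'(0) = 0$, $m''(0) = \sigma^{2}$ can be checked numerically for a concrete choice. The sketch below again uses $X \sim \text{Exponential}(1)$, for which $m(t) = e^{-t}/(1-t)$ and $\sigma^2 = 1$, and approximates the derivatives by central differences:

```python
import math

# Numerical check of m(0) = 1, m'(0) = 0, m''(0) = sigma^2 for the
# illustrative case X ~ Exponential(1), where m(t) = exp(-t)/(1 - t)
# and sigma^2 = 1.
def m(t: float) -> float:
    return math.exp(-t) / (1.0 - t)

h = 1e-4
m0 = m(0.0)
m1 = (m(h) - m(-h)) / (2 * h)             # central difference for m'(0)
m2 = (m(h) - 2 * m(0.0) + m(-h)) / h**2   # central difference for m''(0)

print(m0, round(m1, 4), round(m2, 4))  # ≈ 1, 0, 1
```

The three values match $m(0) = 1$, $m'(0) = 0$, and $m''(0) = \sigma^2 = 1$ up to discretization error.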

According to Taylor’s theorem, $\xi$ lies between $0$ and $\displaystyle { {t} \over {\sigma \sqrt{n} } }$, so when $n \to \infty$, $\xi \to 0$; thus, $m '' (\xi) \to m '' (0) = \sigma^2$. The last term $\displaystyle { { [ m '' (\xi) - \sigma^2 ] t^2} \over {2 \sigma^2 } }$ therefore converges to $0$, and since $\displaystyle \lim_{n \to \infty} \left( 1 + { {b} \over {n} } + { {c_{n}} \over {n} } \right)^{n} = e^{b}$ whenever $c_{n} \to 0$,

$$ \lim _{n \to \infty} M(t) = \lim _{n \to \infty} \left( 1 + { { t^2} \over {2n} } \right)^{n} = e^{t^2 / 2} $$
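The limit above is the classical $\displaystyle \lim_{n \to \infty} \left( 1 + { {b} \over {n} } \right)^{n} = e^{b}$ with $b = t^2/2$; a short numeric illustration, for the arbitrary choice $t = 1$:

```python
import math

# Numeric illustration (not part of the proof): (1 + t^2/(2n))^n
# approaches e^{t^2/2} as n grows, shown here for t = 1.
t = 1.0
target = math.exp(t**2 / 2)
for n in (10, 1_000, 100_000):
    approx = (1 + t**2 / (2 * n)) ** n
    print(n, round(approx, 4))
print("limit:", round(target, 4))  # e^{1/2} ≈ 1.6487
```

Successive values of `approx` climb toward $e^{1/2}$, matching the displayed limit.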

where $e^{t^2 / 2}$ is the moment generating function of the standard normal distribution $N(0,1)$. Since convergence of moment generating functions implies convergence in distribution,

$$ \sqrt{n} {{ \overline{X}_n - \mu } \over {\sigma}} \overset{D}{\to} N (0,1) $$


  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p. 313–315. ↩︎