Proof of the Weak Law of Large Numbers
Law
Let $\left\{ X_{k} \right\}_{k=1}^{n}$ be iid random variables with mean $\mu$ and finite variance $\sigma^2$. Then, as $n \to \infty$, $$ \overline{X}_n \overset{P}{\to} \mu $$
- $\overline{X}_n := \displaystyle {1 \over n} \sum_{k=1}^{n} X_k$ is the sample mean.
- $\overset{P}{\to}$ denotes convergence in probability.
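Written out, convergence in probability means: for every $\epsilon > 0$, $$ \lim_{n \to \infty} P \left( \left| \overline{X}_n - \mu \right| \ge \epsilon \right) = 0 $$ and this limit is exactly what the proof below establishes.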
Explanation
This result ranks among the most important theorems in statistics, alongside the Central Limit Theorem.
The theorem says that, no matter the underlying distribution, the sample mean converges to the population mean. While this might seem obvious, in the natural sciences few 'obvious' statements carry this much weight. Beyond its applications, the name 'law' is fitting for such a proposition, especially from the perspective of those pursuing academia.
Fortunately, despite its significance, the proof itself is simple, requiring only the definition of convergence in probability and Chebyshev's inequality.
Proof¹
A trick for using Chebyshev’s inequality will be employed.
For all $\epsilon > 0$ $$ P(|\overline{X}_n - \mu | \ge \epsilon) = P\left(|\overline{X}_n - \mu | \ge \left( \epsilon \sqrt{n} \over \sigma \right) \left( \sigma \over \sqrt{n} \right) \right) $$
Here, the mean and variance of $\overline{X}_n$ are $\mu$ and $\displaystyle \sigma^2 / n $, respectively, so the conditions for Chebyshev's inequality are met with $k = \epsilon \sqrt{n} / \sigma$.
Chebyshev’s inequality $$ P(| X - \mu | \ge k\sigma ) \le {1 \over k^2} $$
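As a quick empirical sanity check of the inequality, the sketch below estimates $P(|X - \mu| \ge k\sigma)$ by simulation and compares it to $1/k^2$. The Exponential(1) distribution (where $\mu = \sigma = 1$) is an illustrative choice; the bound holds for any distribution with finite variance.

```python
import numpy as np

# Empirical check of Chebyshev's inequality: P(|X - mu| >= k*sigma) <= 1/k^2.
# Exponential(1) has mu = sigma = 1 (an arbitrary, illustrative choice).
rng = np.random.default_rng(0)
mu, sigma = 1.0, 1.0
x = rng.exponential(scale=1.0, size=100_000)

checks = {}
for k in (1.5, 2.0, 3.0):
    # Empirical tail probability vs. the distribution-free bound.
    empirical = float(np.mean(np.abs(x - mu) >= k * sigma))
    bound = 1 / k**2
    checks[k] = (empirical, bound)
    print(f"k={k}: empirical={empirical:.4f} <= bound={bound:.4f}")
```

For a heavy one-sided distribution like Exponential(1) the empirical tail probabilities sit well below the bound, which is loose by design: it makes no assumption beyond a finite variance.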
Therefore, applying Chebyshev's inequality with $k = \epsilon \sqrt{n} / \sigma$, as $n \to \infty$ $$ P(|\overline{X}_n - \mu | \ge \epsilon) = P\left(|\overline{X}_n - \mu | \ge \left( \epsilon \sqrt{n} \over \sigma \right) {\sigma \over \sqrt{n}} \right) \le {{\sigma^2} \over {n \epsilon^2}} \to 0 $$
■
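The conclusion can also be watched numerically: for a fixed $\epsilon$, the probability $P(|\overline{X}_n - \mu| \ge \epsilon)$ estimated by simulation shrinks as $n$ grows and stays under the $\sigma^2 / (n \epsilon^2)$ bound from the proof. Exponential(1) (so $\mu = 1$, $\sigma^2 = 1$) is again an illustrative choice.

```python
import numpy as np

# Simulation sketch of the WLLN: estimate P(|Xbar_n - mu| >= eps) over many
# repetitions and compare it to the Chebyshev bound sigma^2 / (n * eps^2).
rng = np.random.default_rng(1)
mu, var, eps, trials = 1.0, 1.0, 0.1, 2_000

results = []
for n in (10, 100, 1_000):
    # trials independent sample means, each from n iid Exponential(1) draws.
    sample_means = rng.exponential(scale=1.0, size=(trials, n)).mean(axis=1)
    p_hat = float(np.mean(np.abs(sample_means - mu) >= eps))
    bound = var / (n * eps**2)
    results.append((n, p_hat, bound))
    print(f"n={n}: P(|Xbar - mu| >= {eps}) ~= {p_hat:.4f}, bound {bound:.3f}")
```

The estimated probability drops toward zero as $n$ increases, exactly the convergence-in-probability statement of the law; the Chebyshev bound is loose for small $n$ (it can exceed 1) but still captures the $1/n$ decay.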
1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th ed.): p. 296.