Chebyshev's Inequality Proof
Theorem 1
If the variance $\sigma^2 < \infty$ of a random variable $X$ exists, then for every positive number $k>0$, $$ P(|X-\mu| \ge k\sigma) \le {1 \over k^2} $$
Explanation
Chebyshev's inequality is simple in form, easy to manipulate, and its consequences are immediate, which makes it widely used as a lemma. Compared with Markov's inequality, however, it requires one additional condition: the variance must exist. For instance, with $k=2$ it says that at most $1/4$ of the probability mass lies two or more standard deviations away from the mean, regardless of the distribution.
One might regard the requirement that the second moment exists as obvious and trivial. To some extent it is, but the existence of the second moment is not automatic, a point that is easy to overlook, especially for undergraduates.
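As a rough illustration of how the second-moment assumption can fail, the following sketch (assuming NumPy is available; the Lomax/Pareto shape parameter $1.5$ is an arbitrary choice) samples from a distribution whose mean is finite but whose variance is not, so the running sample second moment does not settle down as the sample size grows.

```python
import numpy as np

# Minimal sketch, not part of the proof. Lomax/Pareto with shape a = 1.5
# has a finite mean but an infinite variance (finite variance requires a > 2),
# so Chebyshev's inequality does not apply to it.
rng = np.random.default_rng(0)

for n in [10**3, 10**4, 10**5, 10**6, 10**7]:
    x = rng.pareto(1.5, size=n)              # heavy-tailed sample of size n
    print(n, x.mean(), (x**2).mean())        # sample second moment tends to keep growing
```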
Proof
Strategy: Start from Markov's inequality and use the fact that an inequality between squares is equivalent to the corresponding inequality between absolute values. Since the variance exists by assumption, the mean $\mu$ also exists and requires no separate argument.
Let $u(X) := (X-\mu)^2$.
By Markov's inequality, for a nonnegative function $u(X)$ and any $c > 0$, $$ P(u(X) \ge c) \le {E(u(X)) \over c} $$
If $c:=k^2 \sigma^2$, then $$ P((X-\mu)^2 \ge k^2 \sigma ^2) \le {E((X-\mu)^2) \over {k^2 \sigma^2}} $$ Since $P((X-\mu)^2 \ge k^2 \sigma ^2) = P(|X-\mu| \ge k \sigma)$ and $E((X-\mu)^2)=\sigma^2$, $$ P(|X-\mu| \ge k \sigma) \le {1 \over k^2} $$
■
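As a quick sanity check rather than part of the proof, the following Monte Carlo sketch (assuming NumPy; the Exponential(1) distribution, for which $\mu = \sigma = 1$, is an arbitrary example) compares the empirical tail probability with the bound $1/k^2$ for a few values of $k$.

```python
import numpy as np

# Minimal numerical check of Chebyshev's bound for an Exponential(1) sample,
# which has mu = 1 and sigma = 1.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=10**6)
mu, sigma = 1.0, 1.0

for k in [1.5, 2.0, 3.0]:
    tail = np.mean(np.abs(x - mu) >= k * sigma)   # empirical P(|X - mu| >= k*sigma)
    print(f"k = {k}: empirical tail = {tail:.4f} <= 1/k^2 = {1/k**2:.4f}")
```

The empirical tail probabilities come out well below $1/k^2$, consistent with the bound being loose for a light-tailed distribution such as the exponential.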
Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p. 69. ↩︎