
Markov Inequality Proof

Theorem 1

Let $u(X) \ge 0$ be a function of the random variable $X$. If $E \left( u(X) \right)$ exists, then for $c > 0$ $$ P(u(X) \ge c) \le {E \left( u(X) \right) \over c} $$
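For a quick numerical illustration (not in the source text), let $X$ be a fair die roll and take $u(X) = X$, $c = 5$: $$ P(X \ge 5) = {2 \over 6} = {1 \over 3} \le {E(X) \over 5} = {3.5 \over 5} = 0.7 $$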

Explanation

Chebyshev’s inequality follows from it as a special case, and Markov’s inequality itself is convenient to use as an auxiliary lemma in numerous proofs.
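As a sketch of that connection: substituting $u(X) = (X - \mu)^2$ and $c = k^2 \sigma^2$, where $\mu = E(X)$, $\sigma^2 = \operatorname{Var}(X)$, and $k > 0$, yields $$ P \left( (X - \mu)^2 \ge k^2 \sigma^2 \right) \le {E \left( (X - \mu)^2 \right) \over k^2 \sigma^2} = {\sigma^2 \over k^2 \sigma^2} = {1 \over k^2} $$ which is exactly Chebyshev’s inequality $P \left( |X - \mu| \ge k \sigma \right) \le 1/k^2$.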

You might consider the condition that the first moment $E \left( u(X) \right)$ must exist to be trivially satisfied. While that is often the case, anyone at the undergraduate level or above should be aware that the existence of a moment is not guaranteed in general.
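For example, the standard Cauchy distribution has no first moment at all, since $$ E|X| = \int _{-\infty} ^{\infty} {|x| \over \pi (1 + x^2)} dx = {2 \over \pi} \int _{0} ^{\infty} {x \over 1 + x^2} dx = \infty $$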

Proof

Strategy: Split the range of integration at $c$ into two parts and simplify using nothing but inequalities. The proof below is written for continuous probability distributions, but the same method works for discrete distributions by replacing the integrals with sums.


Let $f$ be the probability density function of the random variable $X$, and define the set $A := \left\{ x : u(x) \ge c \right\}$.

Since $\mathbb{R} = A \cup A^c$, $$ E(u(X)) = \int _{-\infty} ^{\infty} u(x)f(x)dx = \int _{A} u(x)f(x)dx + \int _{A^c} u(x)f(x)dx $$ Since $u(x)f(x) \ge 0$, we have $\displaystyle \int _{A^c} u(x)f(x)dx \ge 0$, so $$ E(u(X)) \ge \int _{A} u(x)f(x)dx $$ Since $u(x) \ge c$ on $A$, $$ E(u(X)) \ge c \int _{A} f(x)dx $$ Since $\displaystyle \int _{A} f(x)dx = P(X \in A) = P(u(X) \ge c)$, $$ E(u(X)) \ge c P(u(X) \ge c) $$ Dividing both sides by $c > 0$ gives $$ {E(u(X)) \over c} \ge P(u(X) \ge c) $$
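As a sanity check (not part of the original proof), here is a minimal Python sketch that estimates both sides of the inequality by Monte Carlo, assuming $X \sim \operatorname{Exp}(1)$ and $u(x) = x$:

```python
import numpy as np

# Monte Carlo sanity check of Markov's inequality:
# P(u(X) >= c) <= E[u(X)] / c, here with u(x) = x and X ~ Exponential(1).
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)

for c in [0.5, 1.0, 2.0, 4.0]:
    lhs = (x >= c).mean()   # empirical P(X >= c); exactly e^{-c} in theory
    rhs = x.mean() / c      # empirical E[X] / c; exactly 1/c in theory
    print(f"c = {c}: P(X >= c) ≈ {lhs:.4f} <= E[X]/c ≈ {rhs:.4f}")
```

For the exponential distribution the bound is quite loose ($e^{-c}$ versus $1/c$), which illustrates that Markov’s inequality trades tightness for generality.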


  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p68. ↩︎