
Convergence in Distribution Implies Boundedness in Probability

Theorem

If a sequence of random variables $\left\{ X_{n} \right\}$ converges in distribution, then it is bounded in probability.
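
For reference, the conclusion of the proof below appeals to the following standard definition: a sequence of random variables $\left\{ X_{n} \right\}$ is bounded in probability if for every $\epsilon > 0$ there exist a constant $B_{\epsilon} > 0$ and an integer $N_{\epsilon}$ such that $$ n \ge N_{\epsilon} \implies P \left[ |X_{n}| \le B_{\epsilon} \right] \ge 1 - \epsilon $$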


Explanation

Since we have shown that almost sure convergence implies convergence in distribution, taking the contrapositive of this theorem also yields the common-sense corollary that a sequence which is not bounded in probability cannot converge almost surely.
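
Schematically, writing $\overset{a.s.}{\to}$ and $\overset{D}{\to}$ for almost sure convergence and convergence in distribution, the chain reads $$ X_{n} \overset{a.s.}{\to} X \implies X_{n} \overset{D}{\to} X \implies \left\{ X_{n} \right\} \text{ is bounded in probability} $$ and the corollary is this chain's contrapositive, read from right to left.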

Proof

Let $\epsilon > 0$ be given, and assume that $X_{n}$ converges in distribution to a random variable $X$ whose cumulative distribution function is $F_{X}$. Since $F_{X}(x) \to 1$ as $x \to \infty$ and $F_{X}(x) \to 0$ as $x \to -\infty$, we can find $\eta_{1}, \eta_{2}$ satisfying $\displaystyle F_{X}(x) > 1- {\epsilon \over 2}$ for all $x \ge \eta_{2}$ and $\displaystyle F_{X}(x) < {\epsilon \over 2}$ for all $x \le \eta_{1}$. Now set $\eta := \max \left\{ |\eta_{1}|, |\eta_{2}| \right\}$; since $F_{X}$ has at most countably many points of discontinuity, $\eta$ may be enlarged if necessary so that $\pm \eta$ are continuity points of $F_{X}$. Then $$ \begin{align*} P[|X|\le \eta] =& F_X (\eta) - F_X (-\eta) \\ \ge& \left( 1 - {\epsilon \over 2} \right) - {\epsilon \over 2} \\ =& 1- \epsilon \end{align*} $$ Since $X_{n}$ converges in distribution to $X$, since $\pm \eta$ are continuity points of $F_{X}$, and since $P[|X_{n}| \le \eta] \ge F_{X_{n}}(\eta) - F_{X_{n}}(-\eta)$ for every $n$, letting $n \to \infty$ gives $$ \begin{align*} \liminf_{n \to \infty} P[|X_{n}|\le \eta] \ge& \lim_{n \to \infty} F_{X_{n}} (\eta) - \lim_{n \to \infty} F_{X_{n}} (-\eta) \\ =& F_X (\eta) - F_X (-\eta) \\ \ge& 1 - \epsilon \end{align*} $$ In particular, there exists an $N_{\epsilon}$ such that $n \ge N_{\epsilon}$ implies $P[|X_{n}| \le \eta] \ge 1 - 2\epsilon$. Since $\epsilon > 0$ was arbitrary, the definition of boundedness in probability is satisfied with $B = \eta$, so $\left\{ X_{n} \right\}$ is bounded in probability.
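
To see the theorem in action, here is a minimal numerical sketch (an illustration, not part of the proof). It assumes the concrete sequence $X_{n} = \sqrt{n} \, \bar{U}_{n}$, where $\bar{U}_{n}$ is the sample mean of $n$ i.i.d. $\text{Uniform}(-1,1)$ variables; by the central limit theorem this converges in distribution to $N(0, 1/3)$, so the theorem says the sequence must be bounded in probability. The bound $\eta$ is chosen from the limiting distribution just as in the proof; all variable names below are illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

eps = 0.05

# Choose eta from the limiting N(0, 1/3) distribution so that
# P[|X| <= eta] = 1 - eps, mirroring the choice of eta in the proof.
eta = norm.ppf(1 - eps / 2, scale=np.sqrt(1 / 3))

reps = 20_000  # Monte Carlo replications per n
for n in [10, 100, 1000]:
    u = rng.uniform(-1.0, 1.0, size=(reps, n))
    x_n = np.sqrt(n) * u.mean(axis=1)    # CLT-scaled sample means
    p_hat = np.mean(np.abs(x_n) <= eta)  # estimate of P[|X_n| <= eta]
    print(f"n = {n:>4}: P[|X_n| <= {eta:.3f}] ~ {p_hat:.4f} (target >= {1 - eps})")
```

The estimated probabilities should stabilize near $1 - \epsilon = 0.95$ as $n$ grows, consistent with the single bound $\eta$ working for all sufficiently large $n$.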