Convergence in Distribution Implies Probability Bound 📂Mathematical Statistics

Theorem

A sequence of random variables $\left\{ X_{n} \right\}$ is bounded in probability if it converges in distribution.


Explanation

Since we have shown that almost sure convergence implies convergence in distribution, taking the contrapositive also yields the common-sense corollary that a sequence which is not bounded in probability cannot converge almost surely.
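The contrapositive can be seen concretely in a minimal numerical sketch. Here a sequence drifting off to infinity (a hypothetical choice, $X_{n} = n + Z$ with $Z$ standard normal, and a helper `prob_within` introduced only for this illustration) fails to be bounded in probability: for any fixed $\eta$, the probability $P[|X_{n}| \le \eta]$ collapses to $0$ once $n$ passes $\eta$, so the sequence cannot converge in distribution, let alone almost surely.

```python
import numpy as np

def prob_within(n, eta, n_samples=10_000):
    """Monte Carlo estimate of P[|X_n| <= eta] for X_n = n + Z, Z ~ N(0, 1)."""
    rng = np.random.default_rng(0)
    samples = n + rng.standard_normal(n_samples)
    return np.mean(np.abs(samples) <= eta)

eta = 50.0
probs = [prob_within(n, eta) for n in (10, 50, 100, 1000)]
# For n well below eta the probability is near 1; once n exceeds eta
# it drops to 0, so no single eta bounds the whole sequence.
```

No matter how large an $\eta$ one tries, the tail of the sequence eventually escapes it, which is exactly the failure of boundedness in probability.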

Proof

Let $\epsilon > 0$ be given, and assume that $X_{n}$ converges in distribution to a random variable $X$ with cumulative distribution function $F_{X}$. Then we can find $\eta_{1}, \eta_{2}$ satisfying $F_{X}(x) > 1 - \dfrac{\epsilon}{2}$ for $x \ge \eta_{2}$ and $F_{X}(x) < \dfrac{\epsilon}{2}$ for $x \le \eta_{1}$. Now set $\eta := \max \left\{ \left| \eta_{1} \right| , \left| \eta_{2} \right| \right\}$; since $F_{X}$ has at most countably many discontinuities, $\eta$ may moreover be chosen so that $\pm \eta$ are continuity points of $F_{X}$. Then
$$
\begin{align*} P \left[ \left| X \right| \le \eta \right] =& F_{X} (\eta) - F_{X} (-\eta) \\ \ge& \left( 1 - {\epsilon \over 2} \right) - {\epsilon \over 2} \\ =& 1 - \epsilon \end{align*}
$$
Since $X_{n}$ converges in distribution to $X$, taking $\displaystyle \lim_{n \to \infty}$ on both sides (that is, choosing a sufficiently large $N_{\epsilon}$) and using the fact that $\pm \eta$ are continuity points of $F_{X}$, we get
$$
\begin{align*} \lim_{n \to \infty} P \left[ \left| X_{n} \right| \le \eta \right] =& \lim_{n \to \infty} F_{X_{n}} (\eta) - \lim_{n \to \infty} F_{X_{n}} (-\eta) \\ =& F_{X} (\eta) - F_{X} (-\eta) \\ \ge& 1 - \epsilon \end{align*}
$$
By the definition of boundedness in probability, $\left\{ X_{n} \right\}$ is bounded in probability.
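The proof can also be checked numerically. As an assumed setup, take $X_{n}$ to be the standardized $\text{Binomial}(n, 1/2)$, which converges in distribution to $X \sim N(0, 1)$ by the central limit theorem. For $\epsilon = 0.02$, the choice $\eta = 2.5$ satisfies $P[|X| \le \eta] \approx 0.9876 \ge 1 - \epsilon$, and the sketch below (the helper `p_within` and the specific values of $\eta$, $\epsilon$, and $n$ are illustrative choices, not from the proof) estimates $P[|X_{n}| \le \eta]$ for several $n$:

```python
import numpy as np

rng = np.random.default_rng(42)

def p_within(n, eta, n_samples=100_000):
    """Monte Carlo estimate of P[|X_n| <= eta] for the standardized Binomial."""
    x = rng.binomial(n, 0.5, size=n_samples)
    z = (x - n * 0.5) / np.sqrt(n * 0.25)  # standardize: mean n/2, var n/4
    return np.mean(np.abs(z) <= eta)

eta, eps = 2.5, 0.02
probs = {n: p_within(n, eta) for n in (30, 100, 1000)}
# Each estimated probability stays near or above 1 - eps = 0.98,
# so the single eta = 2.5 bounds the whole sequence, as the theorem asserts.
```

One fixed $\eta$ works uniformly across $n$, which is precisely the content of boundedness in probability.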