Convergence in Distribution Implies Probability Bound
Theorem
If a sequence of random variables $\{ X_n \}$ converges in distribution, then it is probabilistically bounded.
Explanation
Since convergence almost surely implies convergence in distribution, taking the contrapositive of this theorem yields the common-sense corollary that a sequence which is not probabilistically bounded cannot converge almost surely.
Proof
Given $\epsilon > 0$, suppose $X_n$ converges in distribution to a random variable $X$ with cumulative distribution function $F_X$. Then we can find $\eta_1 , \eta_2$ satisfying $F_X (x) > 1 - \dfrac{\epsilon}{2}$ for $x \ge \eta_2$ and $F_X (x) < \dfrac{\epsilon}{2}$ for $x \le \eta_1$. Now, if we set $\eta := \max \left\{ | \eta_1 | , | \eta_2 | \right\}$, then
$$
\begin{align*}
P \left[ | X | \le \eta \right] =& F_X ( \eta ) - F_X ( - \eta )
\\ \ge& \left( 1 - \frac{\epsilon}{2} \right) - \frac{\epsilon}{2}
\\ =& 1 - \epsilon
\end{align*}
$$
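As a sanity check on this step, here is a minimal Python sketch that, purely for illustration, assumes $X$ is standard normal: it searches for an $\eta$ with $F_X ( \eta ) > 1 - \epsilon / 2$ (by symmetry, $F_X ( - \eta ) < \epsilon / 2$ then holds automatically) and verifies the bound $F_X ( \eta ) - F_X ( - \eta ) \ge 1 - \epsilon$:

```python
import math

def std_normal_cdf(x: float) -> float:
    """CDF of the standard normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def find_eta(eps: float, step: float = 0.01) -> float:
    """Smallest eta on a grid with F(eta) > 1 - eps/2.

    For the symmetric N(0, 1) case, F(-eta) < eps/2 then also holds,
    so this eta plays the role of max{|eta_1|, |eta_2|} in the proof.
    """
    eta = 0.0
    while std_normal_cdf(eta) <= 1.0 - eps / 2.0:
        eta += step
    return eta

eps = 0.05
eta = find_eta(eps)

# The probability mass inside [-eta, eta] meets the 1 - eps bound:
mass = std_normal_cdf(eta) - std_normal_cdf(-eta)
assert mass >= 1 - eps
```

The choice of the normal distribution is only a convenience; any distribution function would do, since the argument uses nothing beyond the tail behavior of $F_X$.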
Since $X_n$ converges in distribution to $X$, taking the limit $n \to \infty$ on both sides (that is, choosing a sufficiently large $N_{\epsilon}$), where $\eta$ is enlarged if necessary so that $\pm \eta$ are continuity points of $F_X$, we get
$$
\begin{align*}
\lim_{n \to \infty} P \left[ | X_n | \le \eta \right] =& \lim_{n \to \infty} F_{X_n} ( \eta ) - \lim_{n \to \infty} F_{X_n} ( - \eta )
\\ =& F_X ( \eta ) - F_X ( - \eta )
\\ \ge& 1 - \epsilon
\end{align*}
$$
By the definition of probabilistic boundedness, $\{ X_n \}$ is probabilistically bounded.
■
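To see the theorem empirically, one can pick a concrete sequence converging in distribution. As an assumption purely for illustration, take $X_n \sim N ( 0 , 1 + 1/n )$, which converges in distribution to $X \sim N ( 0 , 1 )$, and estimate $P \left[ | X_n | \le \eta \right]$ by Monte Carlo for a large $n$:

```python
import math
import random

random.seed(0)

eps = 0.1
# For N(0,1), F(1.645) is about 0.95 = 1 - eps/2, so eta = 1.645 works.
eta = 1.645

n = 1000                     # index in the sequence; X_n ~ N(0, 1 + 1/n)
samples = 20_000
sd_n = math.sqrt(1.0 + 1.0 / n)

inside = sum(abs(random.gauss(0.0, sd_n)) <= eta for _ in range(samples))
p_hat = inside / samples

# Empirically, P[|X_n| <= eta] stays near 1 - eps for large n;
# allow a small Monte Carlo margin below the theoretical bound.
assert p_hat >= 1 - eps - 0.02
```

Tightening `eps` simply forces a larger `eta`; the point of the theorem is that one fixed `eta` works uniformly for all sufficiently large $n$.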