Convergence in Probability Implies Convergence in Distribution
Theorem
Given a random variable $X$ and a sequence of random variables $\{X_n\}$,
$$ X_n \overset{P}{\to} X \implies X_n \overset{D}{\to} X $$
Explanation
In simpler terms, convergence in distribution is much easier to achieve than exact convergence. Thinking of a random variable as a function in its own right should make this not too difficult to grasp.
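As a quick numerical illustration (a sketch of my own, not from the text: take $X \sim N(0,1)$ and $X_n = X + Z_n/n$ with independent $Z_n \sim N(0,1)$, so that $X_n \overset{P}{\to} X$), one can check empirically that $P[X_n \le x]$ approaches $F_X(x)$ as $n$ grows:

```python
import numpy as np
from scipy import stats

# Illustrative setup (my own choice, not from the text): X ~ N(0, 1) and
# X_n = X + Z_n / n with independent Z_n ~ N(0, 1), so |X_n - X| -> 0 in probability.
rng = np.random.default_rng(0)
num_samples = 100_000
x = 0.5  # a continuity point of F_X

X = rng.standard_normal(num_samples)
for n in [1, 10, 100, 1000]:
    Xn = X + rng.standard_normal(num_samples) / n
    # Empirical estimate of F_{X_n}(x) = P[X_n <= x] versus the exact F_X(x)
    print(f"n={n:4d}  P[X_n <= {x}] ~ {np.mean(Xn <= x):.4f}   F_X({x}) = {stats.norm.cdf(x):.4f}")
```

As $n$ increases, the empirical value of $P[X_n \le x]$ settles at $F_X(x)$, which is exactly what the theorem asserts at continuity points of $F_X$.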
Proof
Strategy: The proof uses the trick of splitting an event into two parts and setting up inequalities; it's worth getting used to this, since you'll encounter it frequently in your studies. It's okay if it's hard at first, understanding doesn't always come immediately. Don't get discouraged, and try reading through it several times to grasp the idea.
For an arbitrary positive number $\epsilon > 0$,
$$
\begin{aligned}
F_{X_n}(x) &= P[X_n \le x] \\
&= P[\{X_n \le x\} \cap \{|X_n - X| < \epsilon\}] + P[\{X_n \le x\} \cap \{|X_n - X| \ge \epsilon\}]
\end{aligned}
$$
Looking closely at the first term $P[\{X_n \le x\} \cap \{|X_n - X| < \epsilon\}]$: on this event $X_n \le x$, so
$$
\begin{aligned}
|X_n - X| < \epsilon \implies\ & X < X_n + \epsilon \\
\implies\ & X < X_n + \epsilon \le x + \epsilon \\
\implies\ & X < x + \epsilon
\end{aligned}
$$
and the first term is therefore at most $P[X \le x + \epsilon]$.
While for the second term,
$$ P[\{X_n \le x\} \cap \{|X_n - X| \ge \epsilon\}] \le P[|X_n - X| \ge \epsilon] $$
Putting these together,
$$
\begin{aligned}
F_{X_n}(x) &\le P[X \le x + \epsilon] + P[|X_n - X| \ge \epsilon] \\
&= F_X(x + \epsilon) + P[|X_n - X| \ge \epsilon]
\end{aligned}
$$
Taking the limit superior as $n \to \infty$ on both sides, convergence in probability gives $\lim_{n \to \infty} P[|X_n - X| \ge \epsilon] = 0$, hence
$$ \limsup_{n \to \infty} F_{X_n}(x) \le F_X(x + \epsilon) $$
Having determined an upper bound, finding a lower bound by the same method (sketched below) shows
$$ F_X(x - \epsilon) \le \liminf_{n \to \infty} F_{X_n}(x) \le \limsup_{n \to \infty} F_{X_n}(x) \le F_X(x + \epsilon) $$
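For reference, here is a sketch of that lower-bound argument, which the original leaves to the same method: apply the same event-splitting to $F_X(x - \epsilon)$ and note that on $\{|X_n - X| < \epsilon\}$, $X \le x - \epsilon$ forces $X_n < X + \epsilon \le x$, so
$$
\begin{aligned}
F_X(x - \epsilon) &= P[\{X \le x - \epsilon\} \cap \{|X_n - X| < \epsilon\}] + P[\{X \le x - \epsilon\} \cap \{|X_n - X| \ge \epsilon\}] \\
&\le P[X_n \le x] + P[|X_n - X| \ge \epsilon] \\
&= F_{X_n}(x) + P[|X_n - X| \ge \epsilon]
\end{aligned}
$$
Taking the limit inferior as $n \to \infty$ then gives $F_X(x - \epsilon) \le \liminf_{n \to \infty} F_{X_n}(x)$.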
Since $\epsilon$ is an arbitrary positive number, letting $\epsilon \to 0$ shows that at every continuity point $x \in C_{F_X}$,
$$ \lim_{n \to \infty} F_{X_n}(x) = F_X(x) $$
Thus $X_n$ converges in distribution to $X$.
■
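A standard side remark (not part of the original proof) on why the restriction to continuity points cannot be dropped: take the deterministic sequence $X_n = 1/n$ and $X = 0$, so that $X_n \overset{P}{\to} X$, yet
$$ F_{X_n}(0) = P\left[ \tfrac{1}{n} \le 0 \right] = 0 \quad \text{for every } n, \qquad F_X(0) = P[0 \le 0] = 1 $$
so $F_{X_n}(0)$ does not converge to $F_X(0)$ at the discontinuity point $x = 0$ of $F_X$, even though convergence in distribution still holds.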