Convergence in Probability Defined by Measure Theory
Rigorous Definition of Convergence in Probability
Let $( \Omega , \mathcal{F} , P)$ be a probability space.
A sequence of random variables $\left\{ X_{n} \right\}_{n \in \mathbb{N}}$ is said to converge in probability to a random variable $X$ if it converges in measure to $X$, denoted $X_{n} \overset{P}{\to} X$.
- If you are not yet familiar with measure theory, you can disregard the term probability space.
Explanation
Convergence in measure of $\left\{ X_{n} \right\}_{n \in \mathbb{N}}$ to $X$ means that, for all $\varepsilon > 0$, $$ \lim_{n \to \infty} P \left( \left\{ \omega \in \Omega : | X_{n}(\omega) - X(\omega) | \ge \varepsilon \right\} \right) = 0 $$ or, in the more familiar form, $$ \lim_{n \to \infty} P \left( | X_{n} - X | < \varepsilon \right) = 1 $$ Since a sequence of random variables is itself a stochastic process, convergence in probability naturally appears throughout the theory of stochastic processes.
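To make the definition concrete, here is a minimal Monte Carlo sketch in Python with NumPy. The example itself is an assumption chosen for illustration (not from the original text): take $X = 0$ and $X_{n} \sim \text{Bernoulli}(1/n)$, so that $P(|X_{n} - X| \ge \varepsilon) = 1/n \to 0$ for any $0 < \varepsilon < 1$. The tolerance `eps` and sample size `n_sim` are likewise illustrative choices.

```python
import numpy as np

# Illustrative sketch: X_n ~ Bernoulli(1/n) and X = 0, so
# P(|X_n - X| >= eps) = 1/n -> 0 for any 0 < eps < 1,
# i.e. X_n converges to 0 in probability.
rng = np.random.default_rng(0)
eps = 0.5          # tolerance epsilon (illustrative choice)
n_sim = 100_000    # Monte Carlo sample size (illustrative choice)

for n in [10, 100, 1_000, 10_000]:
    x_n = rng.binomial(1, 1.0 / n, size=n_sim)    # n_sim independent draws of X_n
    prob = np.mean(np.abs(x_n - 0.0) >= eps)      # estimate of P(|X_n - X| >= eps)
    print(f"n = {n:>6}: estimated P(|X_n - X| >= {eps}) ~ {prob:.5f} "
          f"(exact 1/n = {1.0 / n:.5f})")
```

Running this, the estimated probabilities track $1/n$ and shrink toward $0$, which is exactly the limit statement in the definition.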
The following properties of convergence in probability are inherited from convergence in measure:
- [3] If $X_{n}$ converges to $X$ almost surely, then $X_{n}$ converges to $X$ in probability.
- [4] If $X_{n}$ converges to $X$ in $\mathcal{L}_{p}$, then $X_{n}$ converges to $X$ in probability.
Since the probability $P$ is itself a measure, these facts carry over directly from the corresponding properties of convergence in measure; a numerical sketch of property [3] follows below.
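As a quick numerical illustration of property [3], consider an example of my own choosing (an assumption, not from the original): $\Omega = [0, 1)$ with the uniform probability, $X(\omega) = 0$, and $X_{n}(\omega) = \omega^{n}$. Then $X_{n}(\omega) \to 0$ for every $\omega \in [0, 1)$, so the convergence is almost sure, and the estimated $P(|X_{n} - X| \ge \varepsilon)$ should therefore also shrink to $0$, as convergence in probability requires. The tolerance `eps` and sample size are again illustrative.

```python
import numpy as np

# Illustrative sketch of property [3]: on Omega = [0, 1) with the uniform
# probability, X_n(omega) = omega**n converges to X = 0 for every omega,
# hence almost surely, and therefore in probability as well.
rng = np.random.default_rng(1)
eps = 0.1                                       # tolerance epsilon (illustrative choice)
omega = rng.uniform(0.0, 1.0, size=100_000)     # sampled points of Omega

for n in [1, 5, 10, 50, 100]:
    x_n = omega ** n                             # X_n evaluated at the sampled omegas
    prob = np.mean(np.abs(x_n - 0.0) >= eps)     # estimate of P(|X_n - X| >= eps)
    # Exact value: P(omega**n >= eps) = P(omega >= eps**(1/n)) = 1 - eps**(1/n)
    print(f"n = {n:>3}: estimated P(|X_n| >= {eps}) ~ {prob:.5f} "
          f"(exact = {1.0 - eps ** (1.0 / n):.5f})")
```

Both the estimated and exact probabilities decrease toward $0$ as $n$ grows, consistent with the implication that almost sure convergence entails convergence in probability.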
See Also
- Convergence in Probability Defined in Mathematical Statistics
- Almost Sure Convergence $\implies$ Convergence in Probability $\implies$ Convergence in Distribution
- $\mathcal{L}_{p}$ Convergence $\implies$ Convergence in Probability