Convergence in Probability Defined by Measure Theory
Convergence in Probability Defined Rigorously
Given a probability space $\left( \Omega , \mathcal{F} , P \right)$.
A sequence of random variables $\left\{ X_{n} \right\}_{n \in \mathbb{N}}$ is said to converge in probability to a random variable $X$ if it converges in measure to $X$, denoted by $X_{n} \overset{P}{\to} X$.
- If you’re not yet familiar with measure theory, the term probability space can be disregarded.
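As a concrete illustration of the definition (an added example, not part of the original), consider the probability space $\left( [0,1] , \mathcal{B}([0,1]) , \lambda \right)$ with Lebesgue measure $\lambda$ and let $X_{n} := \mathbf{1}_{[0, 1/n]}$. For any $0 < \varepsilon < 1$,
$$ P \left( \left| X_{n} - 0 \right| \ge \varepsilon \right) = \lambda \left( \left[ 0 , \frac{1}{n} \right] \right) = \frac{1}{n} \to 0 , $$
so $X_{n}$ converges in measure, and hence in probability, to $0$.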
Explanation
That $X_{n}$ converges in probability to $X$ means that, for all $\varepsilon > 0$,
$$ \lim_{n \to \infty} P \left( \left| X_{n} - X \right| \ge \varepsilon \right) = 0 , $$
and in a more familiar form it can be written as follows:
$$ \lim_{n \to \infty} P \left( \left| X_{n} - X \right| < \varepsilon \right) = 1 . $$
Since a sequence of random variables is a stochastic process, one can expect convergence in probability to be useful in the theory of stochastic processes.
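The defining limit can also be observed numerically. The sketch below is an illustrative addition: the sample-mean sequence and all parameter values are choices made here, not taken from the original. It estimates $P \left( \left| X_{n} - X \right| \ge \varepsilon \right)$ by Monte Carlo for $X_{n}$ the mean of $n$ i.i.d. $\text{Uniform}(0,1)$ draws, which converges in probability to $X = 1/2$ by the weak law of large numbers.

```python
import numpy as np

rng = np.random.default_rng(0)

eps = 0.05        # the epsilon in P(|X_n - X| >= eps)
trials = 2000     # Monte Carlo repetitions used to estimate that probability

for n in (10, 100, 1000, 5000):
    # Each row is one realization of (U_1, ..., U_n); X_n is the row mean.
    samples = rng.uniform(0.0, 1.0, size=(trials, n))
    x_n = samples.mean(axis=1)
    # Fraction of trials with |X_n - 1/2| >= eps, i.e. an estimate of the
    # probability appearing in the definition of convergence in probability.
    prob = np.mean(np.abs(x_n - 0.5) >= eps)
    print(f"n = {n:5d}   estimated P(|X_n - 1/2| >= {eps}) = {prob:.4f}")
```

The estimated probabilities shrink toward $0$ as $n$ grows, which is exactly the statement $\lim_{n \to \infty} P \left( \left| X_{n} - X \right| \ge \varepsilon \right) = 0$.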
Properties of convergence in probability inherited from convergence in measure:
- [3] If $X_{n}$ converges to $X$ almost surely, then $X_{n}$ converges to $X$ in probability.
- [4] If $X_{n}$ converges to $X$ in $L_{p}$, then $X_{n}$ converges to $X$ in probability (see the sketch after this list).
Since a probability is a measure, convergence in probability inherits these properties from convergence in measure.
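As an example of how these follow, under the $L_{p}$ reading of [4] above, Markov's inequality gives a one-line sketch: for every $\varepsilon > 0$,
$$ P \left( \left| X_{n} - X \right| \ge \varepsilon \right) = P \left( \left| X_{n} - X \right|^{p} \ge \varepsilon^{p} \right) \le \frac{E \left| X_{n} - X \right|^{p}}{\varepsilon^{p}} \to 0 . $$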
See Also
- Convergence in Probability Defined in Mathematical Statistics
- Almost Sure Convergence ⟹ Convergence in Probability ⟹ Convergence in Distribution
- $L_{p}$ Convergence ⟹ Convergence in Probability