Proof of the Monotone Convergence Theorem for Conditional Cases 📂Probability Theory

Theorem

Let $( \Omega , \mathcal{F} , P)$ be a probability space and let $\mathcal{G} \subset \mathcal{F}$ be a sub-$\sigma$-algebra.

Let $\left\{ X_{n} \right\}_{n \in \mathbb{N}}$ be a sequence of random variables and let $X \in \mathcal{L}^{1} (\Omega)$. If $$ X_{1} \le X_{2} \le \cdots \le X \\ X_{n} \to X \text{ a.s.} $$ then $$ \lim_{n \to \infty} E( X_{n} | \mathcal{G} ) = E( \lim_{n \to \infty} X_{n} | \mathcal{G} ) \text{ a.s.} $$


Explanation

The conditional monotone convergence theorem simply states that the monotone convergence theorem applies to conditional expectations just as well. Its role in probability theory is the same as that of MCT.
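The statement can be illustrated numerically. Below is a minimal sketch on a hypothetical finite uniform space $\Omega = \{0, \dots, 7\}$, with $\mathcal{G}$ generated by the parity partition (so conditional expectation is the average over each atom) and $X_{n} = \min(X, n) \nearrow X$; none of these choices come from the source, they are just a convenient concrete instance.

```python
import numpy as np

# Hypothetical finite setup: uniform Omega = {0,...,7}, G generated by
# the parity partition {evens, odds}, X(w) = w, and X_n = min(X, n).
omega = np.arange(8)
X = omega.astype(float)             # target random variable X
blocks = omega % 2                  # atoms of G: even / odd indices

def cond_exp(Z, blocks):
    """E(Z | G) on a uniform finite space: average Z over each atom."""
    out = np.empty_like(Z, dtype=float)
    for b in np.unique(blocks):
        out[blocks == b] = Z[blocks == b].mean()
    return out

# X_n = min(X, n) increases pointwise to X, so E(X_n|G) should
# increase pointwise to E(X|G).
approx = [cond_exp(np.minimum(X, n), blocks) for n in range(10)]
limit = cond_exp(X, blocks)
print(np.max(np.abs(approx[-1] - limit)))   # -> 0.0
```

Since $X \le 7$ here, the sequence reaches $X$ exactly at $n = 7$, so the final gap is exactly zero.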

Proof

Strategy: Use the monotone convergence theorem to pull $\displaystyle \lim_{n \to \infty}$ through $\displaystyle \int$, and use the defining property of conditional expectation to add and remove $E( \cdot | \mathcal{G} )$ until the integrands on both sides agree.


Part 1. $X_{1} \ge 0$

According to the monotone convergence theorem, for all $A \in \mathcal{G}$ $$ \begin{align*} \int_{A} \lim_{n \to \infty} E( X_{n} | \mathcal{G} ) dP \color{red}{=}& \lim_{n \to \infty} \int_{A} E( X_{n} | \mathcal{G} ) dP \\ =& \lim_{n \to \infty} \int_{A} X_{n} dP \\ \color{red}{=}& \int_{A} \lim_{n \to \infty} X_{n} dP \\ =& \int_{A} E( \lim_{n \to \infty} X_{n} | \mathcal{G} ) d P \end{align*} $$ The red equalities follow from the monotone convergence theorem, and the others from the defining property of conditional expectation. Since $\displaystyle \forall A \in \mathcal{F}, \int_{A} f dm = 0 \iff f = 0 \text{ a.e.}$, we conclude $$ \lim_{n \to \infty} E( X_{n} | \mathcal{G} ) = E( \lim_{n \to \infty} X_{n} | \mathcal{G} ) \text{ a.s.} $$
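The defining property invoked twice above, $\int_{A} E(X | \mathcal{G}) \, dP = \int_{A} X \, dP$ for every $A \in \mathcal{G}$, can also be checked on a small example. The sketch below assumes a hypothetical uniform space with $\mathcal{G}$ generated by a two-block partition; the choice of $X$ is arbitrary.

```python
import numpy as np

# Hypothetical check of the defining property of conditional
# expectation: for every A in G, the integral of E(X|G) over A
# equals the integral of X over A.
omega = np.arange(8)
P = np.full(8, 1 / 8)                  # uniform probability measure
blocks = omega % 2                     # atoms of G (two-block partition)
X = (omega ** 2).astype(float)         # arbitrary random variable

def cond_exp(Z, blocks):
    """E(Z | G): average of Z over each atom of the partition."""
    out = np.empty_like(Z, dtype=float)
    for b in np.unique(blocks):
        out[blocks == b] = Z[blocks == b].mean()
    return out

E = cond_exp(X, blocks)
for b in np.unique(blocks):            # A runs over the atoms of G
    A = blocks == b
    assert np.isclose((E * P)[A].sum(), (X * P)[A].sum())
print("defining property holds on every atom")
```

Since every $A \in \mathcal{G}$ is a union of atoms, checking the atoms suffices in this finite setting.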


Part 2. $X_{1} < 0$

Define $Y_{n} := X_{n} - X_{1}$ and $Y := X - X_{1}$, so that $Y_{n} \nearrow Y$ and $Y_{1} = 0 \ge 0$. Therefore, according to Part 1, $$ \lim_{n \to \infty} E( Y_{n} | \mathcal{G} ) = E( \lim_{n \to \infty} Y_{n} | \mathcal{G} ) \text{ a.s.} $$ Then, by the linearity of conditional expectation, we obtain the following. $$ \begin{align*} \lim_{n \to \infty} E( X_{n} | \mathcal{G} ) =& \lim_{n \to \infty} E( Y_{n} + X_{1} | \mathcal{G} ) \\ =& \lim_{n \to \infty} E( Y_{n} | \mathcal{G} ) + E( X_{1} | \mathcal{G} ) \\ =& E( X - X_{1} | \mathcal{G} ) + E( X_{1} | \mathcal{G} ) \\ =& E( X - X_{1} + X_{1} | \mathcal{G} ) \\ =& E( \lim_{n \to \infty} X_{n} | \mathcal{G} ) \text{ a.s.} \end{align*} $$
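The shift trick of Part 2 can be sketched numerically as well. The example below is hypothetical: a finite uniform space where $X_{1}$ takes negative values, $Y_{n} = X_{n} - X_{1} \ge 0$, and the linearity step $E(X_{n} | \mathcal{G}) = E(Y_{n} | \mathcal{G}) + E(X_{1} | \mathcal{G})$ is verified atom by atom.

```python
import numpy as np

# Hypothetical finite check of the Part 2 shift: X_1 is negative
# somewhere, but Y_n = X_n - X_1 >= 0, and linearity of the
# atom-wise average recovers E(X_n|G) from E(Y_n|G) and E(X_1|G).
omega = np.arange(6)
blocks = omega % 3                     # atoms of G
X = omega - 2.0                        # X takes negative values

def cond_exp(Z, blocks):
    """E(Z | G): average of Z over each atom of the partition."""
    out = np.empty_like(Z, dtype=float)
    for b in np.unique(blocks):
        out[blocks == b] = Z[blocks == b].mean()
    return out

X_seq = [X - 1.0 / n for n in range(1, 30)]   # X_n increases to X
X1 = X_seq[0]                                 # X_1 = X - 1, negative somewhere
for Xn in X_seq:
    Yn = Xn - X1                              # Y_n = 1 - 1/n >= 0
    assert np.all(Yn >= 0)
    assert np.allclose(cond_exp(Xn, blocks),
                       cond_exp(Yn, blocks) + cond_exp(X1, blocks))
print("shift argument verified")
```

Here linearity holds exactly because the atom-wise average is a linear map, mirroring the linearity of $E( \cdot | \mathcal{G} )$ used in the proof.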
