
Distribution Convergence of Multivariate Random Variables 📂Mathematical Statistics


Definition¹

When a $p$-dimensional random vector $\mathbf{X}$ and a sequence of random vectors $\left\{ \mathbf{X}_{n} \right\}$ satisfy the following condition as $n \to \infty$, $\mathbf{X}_{n}$ is said to converge in distribution to $\mathbf{X}$, denoted $\mathbf{X}_{n} \overset{D}{\to} \mathbf{X}$. $$\lim_{n \to \infty} F_{\mathbf{X}_{n}} \left( \mathbf{x} \right) = F_{\mathbf{X}} \left( \mathbf{x} \right) \qquad, \forall \mathbf{x} \in C_{F_{\mathbf{X}}}$$


  • $F_{\mathbf{X}}$ is the joint cumulative distribution function of the random vector $\mathbf{X}$.
  • $C_{F_{\mathbf{X}}}$ represents the set of points where the function $F_{\mathbf{X}}$ is continuous.
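The definition above can be checked numerically for a simple case. The sketch below is a Monte Carlo illustration (the specific choice of $\mathbf{X}_{n} = \mathbf{X} + \tfrac{1}{n}\mathbf{1}$ with $\mathbf{X} \sim N_{2}(\mathbf{0}, I)$ is an assumption for illustration, not from the text): since the components are independent standard normals, $F_{\mathbf{X}}(\mathbf{x})$ factors into a product of univariate normal CDFs, and the empirical joint CDF of $\mathbf{X}_{n}$ should approach it at every point as $n$ grows.

```python
# Monte Carlo illustration of convergence in distribution for random vectors.
# Assumed setup: X ~ N_2(0, I) and X_n = X + (1/n) * (1, 1), so that
# F_{X_n}(x) -> F_X(x) at every continuity point x.
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

def std_normal_cdf(t):
    # CDF of N(0, 1) via the error function
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

def joint_cdf(x):
    # F_X for independent standard normal components: product of marginals
    return std_normal_cdf(x[0]) * std_normal_cdf(x[1])

x = np.array([0.3, -0.5])            # an arbitrary continuity point of F_X
X = rng.standard_normal((100_000, 2))  # samples of X

for n in (1, 10, 1000):
    Xn = X + 1.0 / n                               # samples of X_n
    F_hat = np.mean((Xn <= x).all(axis=1))         # empirical F_{X_n}(x)
    print(f"n={n:>5}: empirical {F_hat:.4f} vs limit {joint_cdf(x):.4f}")
```

As $n$ increases, the empirical value of $F_{\mathbf{X}_{n}}(\mathbf{x})$ settles near $F_{\mathbf{X}}(\mathbf{x})$, up to Monte Carlo error.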

Multivariate Central Limit Theorem

Let $\left\{ \mathbf{X}_{n} \right\}$ be a sequence of iid random vectors with mean vector $\mathbf{\mu} \in \mathbb{R}^{p}$ and covariance matrix $\mathbf{\Sigma} \in \mathbb{R}^{p \times p}$. Assuming the moment generating function $m \left( \mathbf{t} \right)$ exists in a neighbourhood of the zero vector $\mathbf{0}$, define $\mathbf{Y}_{n}$ as follows. $$\mathbf{Y}_{n} := {{ 1 } \over { \sqrt{n} }} \sum_{k=1}^{n} \left( \mathbf{X}_{k} - \mathbf{\mu} \right) = \sqrt{n} \left( \overline{\mathbf{X}} - \mathbf{\mu} \right)$$ Then $\mathbf{Y}_{n}$ converges in distribution to the multivariate normal distribution $N_{p} \left( \mathbf{0} , \mathbf{\Sigma} \right)$.

Proof

For $\mathbf{t} \in \mathbb{R}^{p}$ in a neighbourhood of the zero vector $\mathbf{0}$, the moment generating function of $\mathbf{Y}_{n}$ is as follows. With $W_{k} := \mathbf{t} ' \left( \mathbf{X}_{k} - \mathbf{\mu} \right)$, $$ \begin{align*} M_{n} \left( \mathbf{t} \right) =& E \left[ \exp \left\{ \mathbf{t} ' { { 1 } \over { \sqrt{n} } } \sum_{k=1}^{n} \left( \mathbf{X}_{k} - \mathbf{\mu} \right) \right\} \right] \\ =& E \left[ \exp \left\{ { { 1 } \over { \sqrt{n} } } \sum_{k=1}^{n} \mathbf{t} ' \left( \mathbf{X}_{k} - \mathbf{\mu} \right) \right\} \right] \\ =& E \left[ \exp \left\{ { { 1 } \over { \sqrt{n} } } \sum_{k=1}^{n} W_{k} \right\} \right] \end{align*} $$ Here the $W_{k}$ are iid with mean $0$ and variance $\operatorname{Var} \left( W_{k} \right) = \mathbf{t} ' \mathbf{\Sigma} \mathbf{t}$, so by the univariate central limit theorem, $$ { { 1 } \over { \sqrt{n} } } \sum_{k=1}^{n} W_{k} \overset{D}{\to} N \left( 0, \mathbf{t} ' \mathbf{\Sigma} \mathbf{t} \right) $$ Since $M_{n} \left( \mathbf{t} \right)$ is the moment generating function of ${ { 1 } \over { \sqrt{n} } } \sum_{k=1}^{n} W_{k}$ evaluated at $1$, it converges as $n \to \infty$ to the moment generating function of $N \left( 0, \mathbf{t} ' \mathbf{\Sigma} \mathbf{t} \right)$ evaluated at $1$: $$ M_{n} \left( \mathbf{t} \right) \to e^{\mathbf{t} ' \mathbf{\Sigma} \mathbf{t} / 2} $$ This is the moment generating function of the multivariate normal distribution $N_{p} \left( \mathbf{0}, \mathbf{\Sigma} \right)$, so $\mathbf{Y}_{n} \overset{D}{\to} N_{p} \left( \mathbf{0}, \mathbf{\Sigma} \right)$.
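The theorem can be observed in simulation. The sketch below is a minimal illustration under assumed distributions (bivariate vectors with independent Exponential(1) and Uniform(0, 1) components, chosen here for convenience and not taken from the text), for which $\mathbf{\mu} = (1, \tfrac{1}{2})$ and $\mathbf{\Sigma} = \operatorname{diag}(1, \tfrac{1}{12})$; the sample mean and covariance of many realizations of $\mathbf{Y}_{n}$ should approximate $\mathbf{0}$ and $\mathbf{\Sigma}$.

```python
# Simulation sketch of the multivariate CLT. Assumed setup: iid bivariate
# vectors with independent Exponential(1) and Uniform(0,1) components, so
# mu = (1, 1/2) and Sigma = diag(1, 1/12).
import numpy as np

rng = np.random.default_rng(1)
n, reps = 500, 5_000
mu = np.array([1.0, 0.5])
Sigma = np.diag([1.0, 1.0 / 12.0])

# reps independent realizations of n iid bivariate vectors, shape (reps, n, 2)
X = np.stack([rng.exponential(1.0, (reps, n)),
              rng.uniform(0.0, 1.0, (reps, n))], axis=-1)

# Y_n = sqrt(n) * (Xbar - mu), one row per realization, shape (reps, 2)
Y = np.sqrt(n) * (X.mean(axis=1) - mu)

print("sample mean of Y_n:", Y.mean(axis=0))       # should be near (0, 0)
print("sample covariance of Y_n:\n", np.cov(Y.T))  # should be near Sigma
```

The empirical covariance of $\mathbf{Y}_{n}$ approaches $\mathbf{\Sigma}$ even though neither component of $\mathbf{X}_{k}$ is normal, which is exactly what the theorem asserts.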


  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p322. ↩︎