
Distribution Convergence of Multivariate Random Variables 📂Mathematical Statistics


Definition¹

When a $p$-dimensional random vector $\mathbf{X}$ and a sequence of random vectors $\left\{ \mathbf{X}_{n} \right\}$ satisfy the following condition as $n \to \infty$, $\mathbf{X}_{n}$ is said to converge in distribution to $\mathbf{X}$, denoted $\mathbf{X}_{n} \overset{D}{\to} \mathbf{X}$.
$$ \lim_{n \to \infty} F_{\mathbf{X}_{n}} \left( \mathbf{x} \right) = F_{\mathbf{X}} \left( \mathbf{x} \right) \qquad , \forall \mathbf{x} \in C_{F_{\mathbf{X}}} $$


  • $F_{\mathbf{X}}$ is the cumulative distribution function of the random vector $\mathbf{X}$.
  • $C_{F_{\mathbf{X}}}$ is the set of points at which the function $F_{\mathbf{X}}$ is continuous.
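As a minimal numerical sketch of the definition (the distribution choice here is purely illustrative): for $X \sim N(0, 1)$ and $X_{n} := X + 1/n$, the CDF of $X_{n}$ is $F_{X_{n}}(x) = F_{X}(x - 1/n)$, which converges pointwise to $F_{X}(x)$ at every continuity point. The gap can be checked directly using the error function.

```python
import math

def std_normal_cdf(x: float) -> float:
    # CDF of N(0, 1) expressed via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# X_n := X + 1/n has CDF F_{X_n}(x) = F_X(x - 1/n); as n grows,
# the gap to F_X(x) shrinks at any continuity point x of F_X.
x = 1.3  # an arbitrary continuity point (F_X is continuous everywhere here)
for n in (1, 10, 100, 10_000):
    gap = abs(std_normal_cdf(x - 1.0 / n) - std_normal_cdf(x))
    print(n, gap)
```

The printed gaps decrease toward $0$, matching $\lim_{n \to \infty} F_{X_{n}}(x) = F_{X}(x)$.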

Multivariate Central Limit Theorem

Let $\left\{ \mathbf{X}_{n} \right\}$ be a sequence of iid random vectors with mean vector $\boldsymbol{\mu} \in \mathbb{R}^{p}$ and covariance matrix $\Sigma \in \mathbb{R}^{p \times p}$. Assuming that the moment generating function $m(\mathbf{t})$ exists in a neighborhood of the zero vector $\mathbf{0}$, define $\mathbf{Y}_{n}$ as follows.
$$ \mathbf{Y}_{n} := \frac{1}{\sqrt{n}} \sum_{k=1}^{n} \left( \mathbf{X}_{k} - \boldsymbol{\mu} \right) = \sqrt{n} \left( \overline{\mathbf{X}} - \boldsymbol{\mu} \right) $$
Then $\mathbf{Y}_{n}$ converges in distribution to the multivariate normal distribution $N_{p} \left( \mathbf{0}, \Sigma \right)$.
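The theorem can be checked by Monte Carlo simulation. The sketch below (assuming NumPy is available; the choice of distribution is hypothetical) draws iid non-normal 2-dimensional vectors with dependent coordinates, forms $\mathbf{Y}_{n}$ many times, and verifies that the replicates of $\mathbf{Y}_{n}$ have mean $\approx \mathbf{0}$ and covariance $\approx \Sigma$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative iid vectors: X = (E1, E1 + E2) with E1, E2 ~ Exp(1) independent.
mu = np.array([1.0, 2.0])        # E[X] = (1, 2)
Sigma = np.array([[1.0, 1.0],    # Var(E1) = 1, Cov(E1, E1+E2) = 1
                  [1.0, 2.0]])   # Var(E1 + E2) = 2

n, reps = 1_000, 4_000
E = rng.exponential(1.0, size=(reps, n, 2))
X = np.stack([E[..., 0], E[..., 0] + E[..., 1]], axis=-1)  # (reps, n, 2)

# Y_n = sqrt(n) (Xbar - mu), one replicate per row
Y = np.sqrt(n) * (X.mean(axis=1) - mu)

print(Y.mean(axis=0))  # close to (0, 0)
print(np.cov(Y.T))     # close to Sigma
```

Even though each coordinate of $\mathbf{X}$ is skewed (exponential), the replicates of $\mathbf{Y}_{n}$ behave like draws from $N_{2}(\mathbf{0}, \Sigma)$.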

Proof

For $\mathbf{t} \in \mathbb{R}^{p}$ in a neighborhood of the zero vector $\mathbf{0}$, let $W_{k} := \mathbf{t} ' \left( \mathbf{X}_{k} - \boldsymbol{\mu} \right)$. Then the moment generating function of $\mathbf{Y}_{n}$ is as follows.
$$ \begin{align*} M_{n} \left( \mathbf{t} \right) =& E \left[ \exp \left\{ \mathbf{t} ' \frac{1}{\sqrt{n}} \sum_{k=1}^{n} \left( \mathbf{X}_{k} - \boldsymbol{\mu} \right) \right\} \right] \\ =& E \left[ \exp \left\{ \frac{1}{\sqrt{n}} \sum_{k=1}^{n} \mathbf{t} ' \left( \mathbf{X}_{k} - \boldsymbol{\mu} \right) \right\} \right] \\ =& E \left[ \exp \left\{ \frac{1}{\sqrt{n}} \sum_{k=1}^{n} W_{k} \right\} \right] \end{align*} $$
The $W_{k}$ are iid with mean $0$ and variance $\operatorname{Var} \left( W_{k} \right) = \mathbf{t} ' \Sigma \mathbf{t}$, so by the univariate central limit theorem
$$ \frac{1}{\sqrt{n}} \sum_{k=1}^{n} W_{k} \overset{D}{\to} N \left( 0, \mathbf{t} ' \Sigma \mathbf{t} \right) $$
Since the moment generating function of $N \left( 0, \mathbf{t} ' \Sigma \mathbf{t} \right)$ evaluated at $1$ is $e^{\mathbf{t} ' \Sigma \mathbf{t} / 2}$, as $n \to \infty$,
$$ M_{n} \left( \mathbf{t} \right) \to e^{\mathbf{t} ' \Sigma \mathbf{t} / 2} $$
This is the moment generating function of the multivariate normal distribution $N_{p} \left( \mathbf{0}, \Sigma \right)$.
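The key step above reduces the multivariate statement to the univariate CLT for the projections $W_{k} = \mathbf{t} ' \left( \mathbf{X}_{k} - \boldsymbol{\mu} \right)$. A quick simulation of that reduction (NumPy assumed; the fixed direction $\mathbf{t}$, the covariance $\Sigma$, and the use of normal draws are illustrative choices) checks that the scaled sums $\frac{1}{\sqrt{n}} \sum_{k} W_{k}$ indeed have variance $\mathbf{t} ' \Sigma \mathbf{t}$.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.array([0.5, -1.0])        # an arbitrary fixed direction
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
mu = np.zeros(2)

n, reps = 500, 4_000
# iid vectors; any distribution with an mgf near 0 works per the theorem
X = rng.multivariate_normal(mu, Sigma, size=(reps, n))  # (reps, n, 2)

# W_k = t'(X_k - mu); scaled sums, one per replicate
S = (X @ t).sum(axis=1) / np.sqrt(n)

print(S.var(), t @ Sigma @ t)    # sample variance ≈ t' Sigma t
```

Repeating this for every direction $\mathbf{t}$ is exactly what the moment-generating-function argument accomplishes analytically.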


  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p322. ↩︎