
Expectation of Random Vectors

Definition 1

$$ E \left( X \right) := \begin{bmatrix} E \left( X_{1} \right) \\ \vdots \\ E \left( X_{n} \right) \end{bmatrix} $$ The expectation of a random vector $X = \left( X_{1} , \cdots , X_{n} \right)$ is defined as the vector of the expectations of its components, as shown above. Similarly, the expectation of an $m \times n$ random matrix $\mathbf{X} = \left[ X_{ij} \right]$ is defined elementwise as the matrix $E \left( \mathbf{X} \right) := \left[ E \left( X_{ij} \right) \right]$.
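As a small numerical illustration of the componentwise definition (a sketch using numpy; the two distributions are arbitrary choices), the sample mean of many draws of a random vector approximates $E(X)$ coordinate by coordinate:

```python
import numpy as np

rng = np.random.default_rng(0)

# X = (X1, X2) with X1 ~ N(1, 1) and X2 ~ Exp(scale=2), so E(X) = (1, 2)
n_samples = 200_000
samples = np.column_stack([
    rng.normal(loc=1.0, scale=1.0, size=n_samples),   # E(X1) = 1
    rng.exponential(scale=2.0, size=n_samples),       # E(X2) = 2
])

# E(X) is estimated componentwise: the sample mean of each coordinate
estimate = samples.mean(axis=0)
print(estimate)  # close to [1, 2]
```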

Properties

  • [1] Linearity: If $\mathbf{X}_{1}$ and $\mathbf{X}_{2}$ are random matrices of size $m \times n$ and constant matrices $A_{1}, A_{2} \in \mathbb{R}^{k \times m}$ and $B \in \mathbb{R}^{n \times l}$ are given, then the following holds true: $$ \begin{align*} E \left( A_{1} \mathbf{X}_{1} + A_{2} \mathbf{X}_{2} \right) =& A_{1} E \left( \mathbf{X}_{1} \right) + A_{2} E \left( \mathbf{X}_{2} \right) \\ E \left( A_{1} \mathbf{X}_{1} B \right) =& A_{1} E \left( \mathbf{X}_{1} \right) B \end{align*} $$
  • [2] Trace: $E(\tr(\mathbf{X})) = \tr(E(\mathbf{X}))$

Proof

[1]

We will only show $E \left( A \mathbf{X} \right) = A E \left( \mathbf{X} \right)$, and omit the rest.

Let $A = \begin{bmatrix} a_{ik}\end{bmatrix}$ be an $m \times p$ matrix and $\mathbf{X} = \begin{bmatrix} X_{kj}\end{bmatrix}$ a $p \times n$ matrix. Then, by the definition of matrix multiplication and the expectation of matrices,

$$ \begin{align*} E(A \mathbf{X}) &= E \left( \begin{bmatrix} \sum\limits_{k=1}^{p} a_{ik}X_{kj} \end{bmatrix} \right) \\ &= \begin{bmatrix} E \left( \sum\limits_{k=1}^{p} a_{ik}X_{kj} \right) \end{bmatrix} \\ &= \begin{bmatrix} \sum\limits_{k=1}^{p} a_{ik} E \left( X_{kj} \right) \end{bmatrix} & \text{by linearity of $E$} \\ &= A E(\mathbf{X}) \end{align*} $$
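The identity $E(A\mathbf{X}) = A E(\mathbf{X})$ can be checked numerically (a minimal sketch; the matrix sizes and normal entries are arbitrary choices). The sample mean plays the role of $E$, and it commutes with left-multiplication by a constant matrix exactly, mirroring the proof:

```python
import numpy as np

rng = np.random.default_rng(1)

# A: constant 2x3 matrix; X: 3x4 random matrix, drawn 500 times
A = rng.normal(size=(2, 3))
X_samples = rng.normal(size=(500, 3, 4))  # 500 realizations of X

# Left side: average of A @ X over the realizations, i.e. E(AX)
lhs = np.mean(A @ X_samples, axis=0)
# Right side: A times the averaged matrix, i.e. A E(X)
rhs = A @ X_samples.mean(axis=0)

print(np.allclose(lhs, rhs))  # the two sides agree up to float rounding
```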

[2]

Let $\mathbf{X} = \begin{bmatrix} X_{ij} \end{bmatrix}$ be an $n \times n$ matrix.

$$ \begin{align*} E(\tr(\mathbf{X})) &= E \left( \sum\limits_{i=1}^{n} X_{ii} \right) \\ &= \sum\limits_{i=1}^{n} E(X_{ii}) & \text{by linearity of $E$} \\ &= \tr \begin{bmatrix} E(X_{11}) & \cdots & E(X_{1n}) \\ \vdots & \ddots & \vdots \\ E(X_{n1}) & \cdots & E(X_{nn}) \end{bmatrix} & \text{by definition of trace} \\ &= \tr\left( E(\mathbf{X}) \right) \end{align*} $$
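The trace identity admits the same kind of numerical check (a sketch; the size $4 \times 4$ and normal entries are arbitrary): averaging the trace over realizations gives the same number as taking the trace of the averaged matrix.

```python
import numpy as np

rng = np.random.default_rng(2)

# 300 realizations of a 4x4 random matrix X
X_samples = rng.normal(size=(300, 4, 4))

# E(tr(X)): average the trace of each realization
e_of_tr = np.trace(X_samples, axis1=1, axis2=2).mean()
# tr(E(X)): trace of the averaged matrix
tr_of_e = np.trace(X_samples.mean(axis=0))

print(np.isclose(e_of_tr, tr_of_e))  # the two quantities coincide
```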


  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p125. ↩︎