Properties of Mean and Variance
Theorem
The mean $E ( X ) = \mu_{X}$ and variance $\operatorname{Var} (X) = E [ ( X - \mu_{X} )^2 ]$ have the following properties:
- [1]: $E(X + Y) = E(X) + E(Y)$
- [2]: $E(aX + b) = a E(X) + b$
- [3]: $\operatorname{Var} (X) \ge 0$
- [4]: $\operatorname{Var} ( X ) = E(X^2) - \mu_{X}^2$
- [5]: $\operatorname{Var} (aX + b) = a^2 \operatorname{Var} (X)$
Explanation
These are fundamental properties of the mean and variance. In particular, [1] and [2] are known as linearity, which makes manipulating expectations very convenient.
Proof
[1]
For discrete cases $$ \begin{align*} E ( X + Y ) =& \sum_{x} \sum_{y} (x + y) p(x,y) \\ =& \sum_{x} \sum_{y} x p(x,y) + \sum_{x} \sum_{y} y p(x,y) \\ =& \sum_{x} x p_{X}(x) + \sum_{y} y p_{Y}(y) \\ =& E(X) + E(Y) \end{align*} $$ where $p(x,y)$ is the joint probability mass function of $X$ and $Y$, and $p_{X}, p_{Y}$ are its marginals. For continuous cases $$ \begin{align*} E ( X + Y ) =& \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x + y) f(x,y) dx dy \\ =& \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x f(x,y) dx dy + \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y f(x,y) dx dy \\ =& \int_{-\infty}^{\infty} x f_{X}(x) dx + \int_{-\infty}^{\infty} y f_{Y}(y) dy \\ =& E(X) + E(Y) \end{align*} $$ where $f(x,y)$ is the joint probability density function and $f_{X}, f_{Y}$ are its marginals.
■
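As a quick sanity check of [1], here is a minimal sketch in Python, assuming a small hypothetical joint pmf (the table below is made up for illustration):

```python
import numpy as np

# Hypothetical 3x3 joint pmf p(x, y) for X, Y taking values {0, 1, 2}; entries sum to 1.
p = np.array([[0.10, 0.05, 0.05],
              [0.10, 0.20, 0.10],
              [0.05, 0.15, 0.20]])
x_vals = np.array([0, 1, 2])
y_vals = np.array([0, 1, 2])

# E(X + Y) computed directly from the joint pmf.
X, Y = np.meshgrid(x_vals, y_vals, indexing="ij")
lhs = np.sum((X + Y) * p)

# E(X) + E(Y) computed from the marginal pmfs.
p_x = p.sum(axis=1)   # marginal pmf of X
p_y = p.sum(axis=0)   # marginal pmf of Y
rhs = np.sum(x_vals * p_x) + np.sum(y_vals * p_y)

print(lhs, rhs)       # same value
assert np.isclose(lhs, rhs)
```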
[2]
For discrete cases $$ \begin{align*} E ( aX + b ) =& \sum_{x} \left( a x + b \right) p(x) \\ =& a \sum_{x} x p(x) + b \sum_{x} p(x) \\ =& a E(X) + b \end{align*} $$ since $\sum_{x} p(x) = 1$. For continuous cases $$ \begin{align*} E ( aX + b ) =& \int_{-\infty}^{\infty} (ax+b) f(x) dx \\ =& \int_{-\infty}^{\infty} a xf(x) dx + \int_{-\infty}^{\infty} b f(x) dx \\ =& a \int_{-\infty}^{\infty} xf(x) dx + b \int_{-\infty}^{\infty} f(x) dx \\ =& a E(X) + b \end{align*} $$ since $\int_{-\infty}^{\infty} f(x) dx = 1$.
■
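Property [2] can be checked the same way in the continuous case by numerical integration; the density and the constants $a, b$ below are arbitrary choices for illustration:

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

a, b = 3.0, -2.0                      # arbitrary constants
f = norm(loc=1.5, scale=0.7).pdf      # an example density f(x)

# Left-hand side: E(aX + b) = integral of (a x + b) f(x) dx
lhs, _ = integrate.quad(lambda x: (a * x + b) * f(x), -np.inf, np.inf)

# Right-hand side: a E(X) + b, with E(X) = integral of x f(x) dx
ex, _ = integrate.quad(lambda x: x * f(x), -np.inf, np.inf)
rhs = a * ex + b

print(lhs, rhs)                       # both approximately 3 * 1.5 - 2 = 2.5
assert np.isclose(lhs, rhs)
```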
[3]
Since $( X - \mu_{X} )^2 \ge 0$, it follows that $\operatorname{Var} (X) = E [ ( X - \mu_{X} )^2 ] \ge 0$.
■
[4]
$$ \begin{align*} \operatorname{Var} (X) =& E [ ( X - \mu_{X} )^2 ] \\ =& E (X^2 - 2 \mu_{X} X + \mu_{X}^2 ) \\ =& E (X^2) - 2 \mu_{X} E(X) + \mu_{X}^2 \\ =& E(X^2) - \mu_{X}^2 \end{align*} $$
■
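Property [4] is the usual computational shortcut for the variance; here is a minimal check on a hypothetical discrete pmf:

```python
import numpy as np

# Hypothetical pmf: P(X = x) for x in {1, 2, 3, 4}.
x = np.array([1, 2, 3, 4])
p = np.array([0.1, 0.2, 0.3, 0.4])

mu = np.sum(x * p)                            # E(X)
var_def = np.sum((x - mu) ** 2 * p)           # Var(X) = E[(X - mu)^2]
var_shortcut = np.sum(x ** 2 * p) - mu ** 2   # E(X^2) - mu^2

print(var_def, var_shortcut)                  # identical values
assert np.isclose(var_def, var_shortcut)
```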
[5]
By property [2], if $Y = a X + b$ then $\mu_{Y} = a \mu_{X} + b$, and $$ \begin{align*} \operatorname{Var} (Y) =& E [ ( Y - \mu_{Y} )^2 ] \\ =& E [ ( aX + b - a \mu_{X} - b )^2 ] \\ =& E [ a^2 ( X - \mu_{X} )^2 ] \\ =& a^2 \operatorname{Var} (X) \end{align*} $$
■
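Since sample variances transform the same way under $x \mapsto ax + b$, property [5] can also be seen on any fixed array of draws; the distribution and constants below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=10_000)   # any fixed sample will do
a, b = -4.0, 7.0                              # arbitrary constants

# Shifting by b changes nothing; scaling by a multiplies the variance by a^2.
lhs = np.var(a * x + b)
rhs = a ** 2 * np.var(x)

print(lhs, rhs)                               # identical values
assert np.isclose(lhs, rhs)
```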