Independence of Random Variables in Mathematical Statistics
Definition 1
Let two random variables $X_{1}, X_{2}$ have joint probability density function $f$ or joint probability mass function $p$, with marginal probability density functions $f_{1}, f_{2}$ or marginal probability mass functions $p_{1}, p_{2}$. If the following condition holds, then $X_{1}$ and $X_{2}$ are said to be independent, denoted $X_{1} \perp X_{2}$. $$ f(x_{1} , x_{2} ) \equiv f_{1}(x_{1})f_{2}(x_{2}) \\ p(x_{1} , x_{2} ) \equiv p_{1}(x_{1})p_{2}(x_{2}) $$
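As a quick illustration of the defining identity in the discrete case, here is a minimal Python sketch that simulates two dice which are independent by construction and compares the empirical joint pmf with the product of the empirical marginals. The dice example, sample size, and random seed are hypothetical choices for illustration, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: two fair dice rolled independently of each other.
# We estimate the joint pmf and both marginal pmfs from samples and check that
# p(x1, x2) ≈ p1(x1) * p2(x2), the defining condition in the discrete case.
n = 200_000
x1 = rng.integers(1, 7, size=n)
x2 = rng.integers(1, 7, size=n)

edges = np.arange(0.5, 7.0)                        # bin edges 0.5, 1.5, ..., 6.5
joint, _, _ = np.histogram2d(x1, x2, bins=[edges, edges])
joint /= n                                         # empirical joint pmf
p1 = joint.sum(axis=1)                             # empirical marginal pmf of X1
p2 = joint.sum(axis=0)                             # empirical marginal pmf of X2

# Up to Monte Carlo error, the joint pmf factorizes into the product of the marginals.
print(np.max(np.abs(joint - np.outer(p1, p2))))    # small, on the order of 1e-3
```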
Theorem
For convenience, the theorem is stated for continuous random variables, but it also holds for discrete random variables.
The following are all equivalent (a numerical sketch of conditions [4]–[6] follows the list):
- [1] Independence: $X_{1} \perp X_{2}$
- [2] Probability density function: $$ f (x_{1} , x_{2}) \equiv f_{1}(x_{1}) f_{2}(x_{2}) $$
- [3] Cumulative distribution function: For all $(x_{1} ,x_{2}) \in \mathbb{R}^{2}$ $$ F (x_{1} , x_{2}) = F_{1}(x_{1}) F_{2}(x_{2}) $$
- [4] Probability: For all constants $a<b$ and $c < d$ $$ P(a < X_{1} \le b, c < X_{2} \le d) = P(a < X_{1} \le b) P ( c < X_{2} \le d) $$
- [5] Expectation: If $E \left[ u_{1} (X_{1}) \right]$ and $E \left[ u_{2} (X_{2}) \right]$ exist, then $$ E \left[ u_{1}(X_{1}) u_{2}(X_{2}) \right] = E \left[ u_{1} (X_{1}) \right] E \left[ u_{2} (X_{2}) \right] $$
- [6] Moment generating function: If the joint moment generating function $M(t_{1} , t_{2})$ exists, then $$ M(t_{1} , t_{2}) = M (t_{1} , 0 ) M( 0, t_{2} ) $$
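The sketch below checks conditions [4], [5], and [6] by Monte Carlo for two independent standard normal variables. The sample size, the rectangle endpoints $a, b, c, d$, the functions $u_{1}, u_{2}$, and the mgf arguments $t_{1}, t_{2}$ are all arbitrary illustrative choices, not part of the theorem.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent standard normal samples; all constants below are illustrative.
n = 500_000
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)

# [4] Probability of a rectangle factorizes into the product of marginal probabilities.
a, b, c, d = -1.0, 0.5, -0.3, 2.0
lhs = np.mean((a < x1) & (x1 <= b) & (c < x2) & (x2 <= d))
rhs = np.mean((a < x1) & (x1 <= b)) * np.mean((c < x2) & (x2 <= d))
print(lhs, rhs)                                    # agree up to Monte Carlo error

# [5] Expectation of a product of separate functions factorizes,
# here with u1(x) = x**2 and u2(x) = |x|.
print(np.mean(x1**2 * np.abs(x2)), np.mean(x1**2) * np.mean(np.abs(x2)))

# [6] The joint mgf factorizes: M(t1, t2) = M(t1, 0) * M(0, t2).
t1, t2 = 0.3, -0.7
print(np.mean(np.exp(t1 * x1 + t2 * x2)),
      np.mean(np.exp(t1 * x1)) * np.mean(np.exp(t2 * x2)))
```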
Explanation
As the forms of the equivalent conditions above show, independence is precisely the condition that a joint (entangled) function separates into a product of marginal factors. This can be viewed as an abstraction of the independence of events: since $P(A \mid B) = {{ P(AB) } \over { P(B) }}$, independence gives $$ P(AB) = P(A \mid B) P(B) \overset{\text{ind}}{=} P(A) P(B) $$ Understanding independence intuitively is important, but in studying mathematical statistics it is necessary to pay closer attention to its mathematical form.
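To make the event-level factorization $P(AB) = P(A)P(B)$ tangible, a small enumeration suffices. The two dice and the particular events $A$ and $B$ below are hypothetical choices for illustration.

```python
from itertools import product

# Hypothetical example: two fair dice, A = "the first die is even", B = "the sum is 7".
outcomes = list(product(range(1, 7), repeat=2))    # 36 equally likely outcomes
A = {(i, j) for i, j in outcomes if i % 2 == 0}
B = {(i, j) for i, j in outcomes if i + j == 7}

prob = lambda event: len(event) / len(outcomes)

# Both sides equal 1/12, so A and B are independent events.
print(prob(A & B), prob(A) * prob(B))
```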
See Also
Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p. 112.