

Independence of Random Variables in Mathematical Statistics

Definition 1

If, for two random variables $X_{1}, X_{2}$, the joint probability density function $f$ or probability mass function $p$ satisfies the following with respect to the marginal probability density functions $f_{1}, f_{2}$ or probability mass functions $p_{1}, p_{2}$ of $X_{1}, X_{2}$, then $X_{1}$ and $X_{2}$ are said to be independent, denoted $X_{1} \perp X_{2}$.
$$ f(x_{1}, x_{2}) \equiv f_{1}(x_{1}) f_{2}(x_{2}) \\ p(x_{1}, x_{2}) \equiv p_{1}(x_{1}) p_{2}(x_{2}) $$
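For instance (an illustrative example, not taken from the source), the joint density $f(x_{1}, x_{2}) = 4 x_{1} x_{2}$ for $0 < x_{1}, x_{2} < 1$ has marginals $f_{1}(x_{1}) = 2 x_{1}$ and $f_{2}(x_{2}) = 2 x_{2}$, so
$$ f_{1}(x_{1}) f_{2}(x_{2}) = 4 x_{1} x_{2} = f(x_{1}, x_{2}) $$
and the two variables are independent. By contrast, a density such as $g(x_{1}, x_{2}) = x_{1} + x_{2}$ on the same unit square cannot be written as such a product, so its variables are not independent.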

Theorem

For convenience, the theorem is stated for continuous random variables with density functions; the same equivalences hold for discrete random variables with mass functions.

The following are all equivalent:

  • [1]: $X_{1} \perp X_{2}$
  • [2] Probability density function: $f(x_{1}, x_{2}) \equiv f_{1}(x_{1}) f_{2}(x_{2})$
  • [3] Cumulative distribution function: For all $(x_{1}, x_{2}) \in \mathbb{R}^{2}$, $F(x_{1}, x_{2}) = F_{1}(x_{1}) F_{2}(x_{2})$
  • [4] Probability: For all constants $a < b$ and $c < d$, $P(a < X_{1} \le b,\ c < X_{2} \le d) = P(a < X_{1} \le b) \, P(c < X_{2} \le d)$
  • [5] Expectation: If $E\left[ u(X_{1}) \right]$ and $E\left[ v(X_{2}) \right]$ exist, then $E\left[ u(X_{1}) v(X_{2}) \right] = E\left[ u(X_{1}) \right] E\left[ v(X_{2}) \right]$ (checked numerically in the sketch after this list)
  • [6] Moment generating function: If the joint moment generating function $M(t_{1}, t_{2})$ exists, then $M(t_{1}, t_{2}) = M(t_{1}, 0) \, M(0, t_{2})$
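The product forms in [4] and [5] are easy to verify empirically. The following is a minimal Monte Carlo sketch (an illustrative addition; the distributions, interval endpoints, and the functions `u`, `v` are arbitrary choices, not from the source) showing that, for independently generated samples, the joint quantities agree with the corresponding products up to simulation error.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Independent samples: X1 ~ Exponential(1), X2 ~ Normal(0, 1) (arbitrary example distributions)
x1 = rng.exponential(scale=1.0, size=n)
x2 = rng.normal(loc=0.0, scale=1.0, size=n)

# Condition [4]: P(a < X1 <= b, c < X2 <= d) vs. P(a < X1 <= b) * P(c < X2 <= d)
a, b, c, d = 0.5, 2.0, -1.0, 1.0
joint_prob = np.mean((x1 > a) & (x1 <= b) & (x2 > c) & (x2 <= d))
prod_prob = np.mean((x1 > a) & (x1 <= b)) * np.mean((x2 > c) & (x2 <= d))
print(joint_prob, prod_prob)  # the two estimates agree up to Monte Carlo error

# Condition [5]: E[u(X1) v(X2)] vs. E[u(X1)] * E[v(X2)]
u = lambda x: np.sqrt(x)   # any functions whose expectations exist
v = lambda x: np.cos(x)
print(np.mean(u(x1) * v(x2)), np.mean(u(x1)) * np.mean(v(x2)))
```

If `x2` were instead constructed from `x1` (say, `x2 = x1 + noise`), the paired quantities would no longer agree, reflecting the loss of independence.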

Explanation

As the equivalent conditions above show, independence means that the entangled (joint) function can be separated into a product. This can be seen as an abstraction of the independence of events, which allows probabilities to be separated as
$$ P(A \mid B) = \frac{P(AB)}{P(B)} \overset{\text{ind}}{\implies} P(AB) = P(A \mid B) P(B) = P(A) P(B) $$
Understanding independence intuitively is important, but in studying mathematical statistics, more attention should be paid to this mathematical form.
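As a concrete check of the event version (an illustrative example, not from the source), take one roll of a fair die with $A = \{ \text{even} \}$ and $B = \{1, 2, 3, 4\}$; then $P(A) = 1/2$, $P(B) = 2/3$, and $P(AB) = P(\{2, 4\}) = 1/3 = P(A)P(B)$, so $A$ and $B$ are independent. The same enumeration in code:

```python
from fractions import Fraction

omega = range(1, 7)                    # outcomes of one fair die roll
A = {x for x in omega if x % 2 == 0}   # even outcome
B = {1, 2, 3, 4}                       # outcome at most 4

P = lambda E: Fraction(len(E), 6)      # uniform probability on omega
print(P(A & B), P(A) * P(B))           # both are 1/3, so A and B are independent
```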

See Also


  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p112. ↩︎