Independence of Random Variables in Mathematical Statistics
Definition
If the joint probability density function $f$ (or joint probability mass function $p$) of two random variables $X_1, X_2$ satisfies the following condition with respect to the marginal probability density functions $f_1, f_2$ (or probability mass functions $p_1, p_2$) of $X_1$ and $X_2$, then $X_1$ and $X_2$ are said to be independent, which is denoted by $X_1 \perp X_2$.
$$
f(x_1, x_2) \equiv f_1(x_1) f_2(x_2) \qquad \text{or} \qquad p(x_1, x_2) \equiv p_1(x_1) p_2(x_2)
$$
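For instance (a concrete illustration added here, not part of the definition itself), let $X_1$ and $X_2$ be the outcomes of two fair dice rolled separately. The joint probability mass function factors as
$$
p(x_1, x_2) = \frac{1}{36} = \frac{1}{6} \cdot \frac{1}{6} = p_1(x_1) p_2(x_2), \qquad x_1, x_2 \in \{ 1, \dots, 6 \}
$$
so $X_1 \perp X_2$.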
Theorem
For convenience the statement is written for continuous random variables, but it also holds for discrete random variables. The following are all equivalent (a numerical sanity check is sketched after the list):
- [1]: $X_1 \perp X_2$
- [2] Probability density function:
$$
f(x_1, x_2) \equiv f_1(x_1) f_2(x_2)
$$
- [3] Cumulative distribution function: For all $(x_1, x_2) \in \mathbb{R}^2$,
$$
F(x_1, x_2) = F_1(x_1) F_2(x_2)
$$
- [4] Probability: For all constants $a < b$ and $c < d$,
$$
P(a < X_1 \le b, c < X_2 \le d) = P(a < X_1 \le b) P(c < X_2 \le d)
$$
- [5] Expectation: Whenever $E[u(X_1)]$ and $E[u(X_2)]$ exist,
$$
E[u(X_1) u(X_2)] = E[u(X_1)] E[u(X_2)]
$$
- [6] Moment generating function: If the joint moment generating function $M(t_1, t_2)$ exists,
$$
M(t_1, t_2) = M(t_1, 0) M(0, t_2)
$$
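As a rough numerical illustration of conditions [4] and [5] (this sketch is not from the original text; the distributions $X_1 \sim N(0,1)$ and $X_2 \sim \mathrm{Exp}(1)$, the function $u(x) = x^2$, and the interval endpoints are arbitrary choices made only for the example), one can draw independent samples and compare both sides of each identity by Monte Carlo:

```python
# A minimal Monte Carlo sketch, assuming X1 ~ N(0, 1) and X2 ~ Exp(1) are
# sampled independently; the distributions, u(x) = x**2, and the interval
# endpoints below are arbitrary choices made only for this illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x1 = rng.standard_normal(n)     # independent draws of X1
x2 = rng.exponential(1.0, n)    # independent draws of X2

# [4] Probability: P(a < X1 <= b, c < X2 <= d) vs P(a < X1 <= b) * P(c < X2 <= d)
a, b, c, d = -1.0, 0.5, 0.2, 1.5
in1 = (x1 > a) & (x1 <= b)
in2 = (x2 > c) & (x2 <= d)
print("joint prob:", (in1 & in2).mean(), " product:", in1.mean() * in2.mean())

# [5] Expectation with u(x) = x**2: E[u(X1) u(X2)] vs E[u(X1)] * E[u(X2)]
u1, u2 = x1**2, x2**2
print("joint mean:", (u1 * u2).mean(), " product:", u1.mean() * u2.mean())
```

The two numbers printed on each line should agree up to Monte Carlo error of order $1/\sqrt{n}$; approximate equality on a finite sample is of course only a sanity check of the stated identities, not a proof of independence.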
Explanation
As the equivalent conditions above show, independence is precisely the condition that the joint functions can be separated into a product of the marginal ones. This can be seen as an abstraction of the independence of events, under which probabilities factor as
$$
P(A \mid B) = \frac{P(A \cap B)}{P(B)} \overset{\text{ind}}{\implies} P(A \cap B) = P(A \mid B) P(B) = P(A) P(B)
$$
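For example (a standard illustration, not taken from the text above), draw one card from a well-shuffled deck of 52 and let $A$ be "the card is a heart" and $B$ be "the card is an ace". Then
$$
P(A \cap B) = \frac{1}{52} = \frac{13}{52} \cdot \frac{4}{52} = P(A) P(B)
$$
so the events $A$ and $B$ are independent.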
Understanding independence intuitively is important, but when studying mathematical statistics it is necessary to pay closer attention to its mathematical form.
See Also