Independence and iid of Random Variables

Definition 1

  1. Random variables $X_{1}, \cdots, X_{n}$ are said to be pairwise independent if they satisfy the following. $$i \ne j \implies X_{i} \perp X_{j}$$
  2. Continuous random variables $X_{1}, \cdots, X_{n}$ are said to be mutually independent if their joint probability density function $f$ satisfies the following condition with respect to the marginal probability density functions $f_{1}, \cdots, f_{n}$. $$f(x_{1}, \cdots, x_{n}) \equiv f_{1}(x_{1}) \cdots f_{n}(x_{n})$$
  3. Discrete random variables $X_{1}, \cdots, X_{n}$ are said to be mutually independent if their joint probability mass function $p$ satisfies the following condition with respect to the marginal probability mass functions $p_{1}, \cdots, p_{n}$. $$p(x_{1}, \cdots, x_{n}) \equiv p_{1}(x_{1}) \cdots p_{n}(x_{n})$$
  4. Random variables $X_{1}, \cdots, X_{n}$ that are mutually independent and have the same distribution are called iid (Independent and Identically Distributed); a worked example follows this list.
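
For a concrete instance (an illustration, not from the text), combine Definitions 2 and 4: if $X_{1}, \cdots, X_{n} \overset{\text{iid}}{\sim} N(\mu, \sigma^{2})$, the joint density factors into a product of identical normal marginals.

$$f(x_{1}, \cdots, x_{n}) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}\sigma} \exp \left( - \frac{(x_{i} - \mu)^{2}}{2\sigma^{2}} \right) = \left( 2\pi\sigma^{2} \right)^{-n/2} \exp \left( - \frac{1}{2\sigma^{2}} \sum_{i=1}^{n} (x_{i} - \mu)^{2} \right)$$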

Explanation

  • The concept of pairwise independence matters less in its own right than as a contrast: it describes the weaker situation that falls short of the nicer condition of mutual independence. Naturally, mutual independence implies pairwise independence, but not vice versa; a prominent counterexample is the Bernstein distributions, sketched in code after this list.
  • iid is a favorite assumption in mathematical statistics, since mutual independence is mathematically manageable and each random variable is identically distributed. For example, if the common distribution is $D$, then $X_{1}, \cdots, X_{n}$ are said to be iid random variables following distribution $D$, which can also be represented as follows. $$X_{1}, \cdots, X_{n} \overset{\text{iid}}{\sim} D$$
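
The following is a minimal sketch of Bernstein's construction: $X$ and $Y$ are fair coin flips and $Z = X \oplus Y$ is their exclusive or, so every pair among the three variables is independent while the triple is not mutually independent. The enumeration below is illustrative, not from the text.

```python
from itertools import product

# Bernstein's construction: X and Y are fair coin flips, Z = X XOR Y.
# The four (x, y) outcomes are equally likely, each with probability 1/4.
outcomes = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]

def prob(event):
    """Probability of an event over the four equally likely outcomes."""
    return sum(1 for w in outcomes if event(w)) / len(outcomes)

# Pairwise independence: P(A = a, B = b) = P(A = a) P(B = b) for every pair.
for i, j in [(0, 1), (0, 2), (1, 2)]:
    for a, b in product([0, 1], repeat=2):
        joint = prob(lambda w: w[i] == a and w[j] == b)
        assert joint == prob(lambda w: w[i] == a) * prob(lambda w: w[j] == b)

# ... but not mutual independence: P(X=0, Y=0, Z=0) = 1/4, not (1/2)^3 = 1/8.
print(prob(lambda w: w == (0, 0, 0)), prob(lambda w: w[0] == 0) ** 3)
```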

Theorems

  • [1] Expectation: If $X_{1}, \cdots, X_{n}$ are mutually independent, then for functions $u_{1}, \cdots, u_{n}$ applied to each, the following holds; a numerical check follows this list. $$E \left[ u_{1}(X_{1}) \cdots u_{n}(X_{n}) \right] = E \left[ u_{1}(X_{1}) \right] \cdots E \left[ u_{n}(X_{n}) \right]$$
  • [2] Moment Generating Function: If $X_{1}, \cdots, X_{n}$ are mutually independent and each has a moment generating function $M_{i}(t)$ for $-h_{i} < t < h_{i}$, then the moment generating function of their linear combination $\displaystyle T := \sum_{i=1}^{n} a_{i} X_{i}$ is as follows; see the second sketch below. $$M_{T}(t) = \prod_{i=1}^{n} M_{i} \left( a_{i} t \right) \qquad , -\min_{i=1, \cdots, n} h_{i} < t < \min_{i=1, \cdots, n} h_{i}$$
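
As a numerical illustration of [1] (a sketch only; the distributions and the functions $u_{1}(x) = x^{2}$, $u_{2}(x) = e^{x}$ are illustrative choices, not from the text), the sample mean of the product should match the product of the sample means for independent draws.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Independent draws; the distributions are chosen only for illustration.
x1 = rng.normal(0.0, 1.0, n)     # X1 ~ N(0, 1)
x2 = rng.uniform(0.0, 1.0, n)    # X2 ~ U(0, 1)

u1 = lambda x: x ** 2            # u1(X1) = X1^2
u2 = np.exp                      # u2(X2) = e^{X2}

lhs = np.mean(u1(x1) * u2(x2))            # E[u1(X1) u2(X2)]
rhs = np.mean(u1(x1)) * np.mean(u2(x2))   # E[u1(X1)] E[u2(X2)]
print(lhs, rhs)  # both approach E[X1^2] E[e^{X2}] = 1 * (e - 1) ≈ 1.718
```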

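Theorem [2] can be sanity checked the same way. In the sketch below (again illustrative, not from the text) the $X_{i}$ are iid standard normals, whose moment generating function $M_{i}(t) = e^{t^{2}/2}$ exists for all $t$, so the interval constraint is vacuous and the empirical moment generating function of $T$ should match $\prod_{i} M_{i}(a_{i} t)$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
a = np.array([2.0, -1.0, 0.5])   # illustrative coefficients a_i

# X1, X2, X3 iid N(0, 1), whose mgf M_i(t) = exp(t^2 / 2) exists for all t.
x = rng.standard_normal((n, 3))
T = x @ a                        # linear combination T = sum_i a_i X_i

for t in np.linspace(-1.0, 1.0, 5):
    empirical = np.mean(np.exp(t * T))             # sample estimate of M_T(t)
    product = np.prod(np.exp((a * t) ** 2 / 2.0))  # prod_i M_i(a_i t)
    print(f"t = {t:+.2f}: empirical {empirical:.4f} vs product {product:.4f}")
```

Since $T = \sum_{i} a_{i} X_{i} \sim N \left( 0, \sum_{i} a_{i}^{2} \right)$ here, both columns approach $e^{t^{2} \sum_{i} a_{i}^{2} / 2}$.
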
  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p122~125. ↩︎