
Independence and iid of Random Variables

Definition 1

  1. Random variables $X_{1} , \cdots , X_{n}$ are said to be pairwise independent if they satisfy the following. $$ i \ne j \implies X_{i} \perp X_{j} $$
  2. Continuous random variables $X_{1} , \cdots , X_{n}$ are said to be mutually independent if their joint probability density function $f$ satisfies the following condition with respect to their marginal probability density functions $f_{1} , \cdots , f_{n}$ (a worked example follows this list). $$ f(x_{1} , \cdots , x_{n} ) \equiv f_{1} (x_{1}) \cdots f_{n} (x_{n}) $$
  3. Discrete random variables $X_{1} , \cdots , X_{n}$ are said to be mutually independent if their joint probability mass function $p$ satisfies the following condition with respect to their marginal probability mass functions $p_{1} , \cdots , p_{n}$. $$ p(x_{1} , \cdots , x_{n} ) \equiv p_{1} (x_{1}) \cdots p_{n} (x_{n}) $$
  4. Random variables $X_{1} , \cdots , X_{n}$ that are mutually independent and have the same distribution are called iid (Independent and Identically Distributed).
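
As a small worked illustration of Definition 2 (the density below is chosen here purely for illustration and does not come from the reference), consider two continuous random variables $X, Y$ with joint density $$ f(x, y) = 4xy , \qquad 0 \le x \le 1 ,\ 0 \le y \le 1 $$ The marginals are $f_{1}(x) = \int_{0}^{1} 4xy \, dy = 2x$ and $f_{2}(y) = 2y$, so $f(x, y) = f_{1}(x) f_{2}(y)$ holds everywhere and $X, Y$ are mutually independent. Since the two marginals are also identical, $X, Y$ are in fact iid.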

Explanation

  • The concept of pairwise independence matters less in its own right than as a name for the weaker condition that falls short of the nicer condition of mutual independence. Naturally, mutual independence implies pairwise independence, but the converse does not hold. A prominent counterexample is the Bernstein distributions (see the sketch after this list).
  • iid is a favorite assumption in mathematical statistics, since mutual independence keeps the mathematics manageable and every random variable follows the same distribution. For example, if that common distribution is $D$, then $X_{1} , \cdots , X_{n}$ are said to be iid random variables following distribution $D$, which can also be written as follows. $$ X_{1} , \cdots , X_{n} \overset{\text{iid}}{\sim} D $$
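
To make the gap between pairwise and mutual independence concrete, here is a minimal sketch in Python of a Bernstein-style construction (two fair coin flips and a "match" indicator; the construction and all names are chosen here for illustration and are not code from the reference):

```python
from itertools import product

# Bernstein-style example (illustrative): two fair coin flips A, B
# and the "match" indicator C = 1 when A == B, else 0.
outcomes = [(a, b, int(a == b)) for a, b in product([0, 1], repeat=2)]  # 4 equally likely points

def prob(event):
    """Probability of an event, given as a predicate on (a, b, c), under the uniform measure."""
    return sum(event(o) for o in outcomes) / len(outcomes)

# Pairwise independence: every pair of coordinates factors.
for i, j in [(0, 1), (0, 2), (1, 2)]:
    for x, y in product([0, 1], repeat=2):
        joint = prob(lambda o: o[i] == x and o[j] == y)
        assert joint == prob(lambda o: o[i] == x) * prob(lambda o: o[j] == y)

# Mutual independence fails: the joint pmf of (A, B, C) does not factor.
p_joint = prob(lambda o: o == (1, 1, 1))   # 1/4
p_product = (prob(lambda o: o[0] == 1)
             * prob(lambda o: o[1] == 1)
             * prob(lambda o: o[2] == 1))  # 1/8
print(p_joint, p_product)                  # 0.25 0.125
```

Every pair of the three variables factors, yet $P(A = 1, B = 1, C = 1) = 1/4$ differs from the product $1/8$, so the three are pairwise but not mutually independent.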

Theorems

  • [1] Expectation: If $X_{1} , \cdots , X_{n}$ are mutually independent, then for any functions $u_{1} , \cdots , u_{n}$ applied to each of them, $$ E \left[ u_{1}(X_{1}) \cdots u_{n}(X_{n}) \right] = E \left[ u_{1}(X_{1}) \right] \cdots E \left[ u_{n}(X_{n}) \right] $$
  • [2] Moment Generating Function: If $X_{1} , \cdots , X_{n}$ are mutually independent and each has a moment-generating function $M_{i}(t)$ for $-h_{i} < t < h_{i}$, then the moment-generating function of their linear combination $\displaystyle T := \sum_{i=1}^{n} a_{i} X_{i}$ is as follows (a worked example follows this list). $$ M_{T} (t) = \prod_{i=1}^{n} M_{i} \left( a_{i} t \right) , \qquad -\min_{i=1, \cdots, n} h_{i} < t < \min_{i=1, \cdots, n} h_{i} $$
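
As a standard application of [2] (the normal distribution here is chosen for illustration and is not part of the reference's statement): if $X_{1} , \cdots , X_{n} \overset{\text{iid}}{\sim} N ( \mu , \sigma^{2} )$ and $T := \sum_{i=1}^{n} X_{i}$, so that every $a_{i} = 1$, then each $M_{i}(t) = \exp \left( \mu t + \sigma^{2} t^{2} / 2 \right)$ for all $t \in \mathbb{R}$, and therefore $$ M_{T}(t) = \prod_{i=1}^{n} \exp \left( \mu t + {{ \sigma^{2} t^{2} } \over { 2 }} \right) = \exp \left( n \mu t + {{ n \sigma^{2} t^{2} } \over { 2 }} \right) $$ which is the moment-generating function of $N ( n \mu , n \sigma^{2} )$, so $T \sim N ( n \mu , n \sigma^{2} )$.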

  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): pp. 122–125. ↩︎