Independence and iid of Random Variables
Definition 1
- Random variables $X_{1} , \cdots , X_{n}$ are said to be pairwise independent if every pair $X_{i} , X_{j}$ with $i \ne j$ is independent, that is, $$ f_{ij} \left( x_{i} , x_{j} \right) = f_{i} \left( x_{i} \right) f_{j} \left( x_{j} \right) $$ where $f_{ij}$ is the joint density (or mass) function of the pair and $f_{i} , f_{j}$ are the marginals.
- Continuous random variables $X_{1} , \cdots , X_{n}$ are said to be mutually independent if their joint probability density function $f$ satisfies $$ f \left( x_{1} , \cdots , x_{n} \right) = f_{1} \left( x_{1} \right) f_{2} \left( x_{2} \right) \cdots f_{n} \left( x_{n} \right) $$ with respect to each of their probability density functions $f_{1} , \cdots , f_{n}$.
- Discrete random variables $X_{1} , \cdots , X_{n}$ are said to be mutually independent if their joint probability mass function $p$ satisfies $$ p \left( x_{1} , \cdots , x_{n} \right) = p_{1} \left( x_{1} \right) p_{2} \left( x_{2} \right) \cdots p_{n} \left( x_{n} \right) $$ with respect to each of their probability mass functions $p_{1} , \cdots , p_{n}$.
- Random variables that are mutually independent and have the same distribution are called iid (Independent and Identically Distributed).
Explanation
- The concept of pairwise independence matters less in its own right than as the name for a weaker condition that falls short of mutual independence. Naturally, mutual independence implies pairwise independence, but not vice versa. A prominent counterexample is the Bernstein distribution.
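The gap between the two notions can be checked by exact enumeration. The sketch below uses Bernstein's classic construction (an assumption here: that this standard construction is what is meant by the Bernstein distribution): toss two fair coins and let $X$, $Y$ be the two tosses and $Z$ their XOR.

```python
from itertools import product
from fractions import Fraction

# Bernstein's classic construction: toss two fair coins; let
#   X = first toss, Y = second toss, Z = XOR of the two tosses.
# Each of X, Y, Z is Bernoulli(1/2).
outcomes = list(product([0, 1], repeat=2))  # four equally likely outcomes
quarter = Fraction(1, 4)

def prob(event):
    """Exact probability of an event over the four equally likely outcomes."""
    return sum(quarter for (a, b) in outcomes if event(a, b))

X = lambda a, b: a
Y = lambda a, b: b
Z = lambda a, b: a ^ b

# Pairwise independent: P(U=1, V=1) = P(U=1) P(V=1) for every pair U, V.
for U, V in [(X, Y), (X, Z), (Y, Z)]:
    joint = prob(lambda a, b: U(a, b) == 1 and V(a, b) == 1)
    assert joint == prob(lambda a, b: U(a, b) == 1) * prob(lambda a, b: V(a, b) == 1)

# Not mutually independent: P(X=1, Y=1, Z=1) = 0, not (1/2)^3 = 1/8.
triple = prob(lambda a, b: X(a, b) == 1 and Y(a, b) == 1 and Z(a, b) == 1)
print(triple)  # 0
```

Every pair factors, but the triple cannot: knowing any two of the variables determines the third.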
- iid is a favorite assumption in mathematical statistics, since mutual independence is mathematically manageable and every random variable is identically distributed. For example, if the common distribution is $D$, then $X_{1} , \cdots , X_{n}$ are said to be iid random variables following distribution $D$, which can also be written as $$ X_{1} , \cdots , X_{n} \overset{\text{iid}}{\sim} D $$
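In simulation, "iid following $D$" simply means repeating independent draws from one fixed distribution. A minimal sketch (the normal distribution, seed, and sample size are arbitrary illustrative choices):

```python
import random

# Draw n iid samples from a single fixed distribution, here N(0, 1).
# The seed, distribution, and sample size are arbitrary choices.
random.seed(0)
n = 10_000
samples = [random.gauss(0.0, 1.0) for _ in range(n)]

# Because the draws are iid, the sample mean approximates the common mean 0.
sample_mean = sum(samples) / n
print(sample_mean)
```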
Theorems
- [1] Expectation: If $X_{1} , \cdots , X_{n}$ are mutually independent, then for functions $u_{1} , \cdots , u_{n}$ applied to each, $$ E \left[ u_{1} \left( X_{1} \right) u_{2} \left( X_{2} \right) \cdots u_{n} \left( X_{n} \right) \right] = E \left[ u_{1} \left( X_{1} \right) \right] E \left[ u_{2} \left( X_{2} \right) \right] \cdots E \left[ u_{n} \left( X_{n} \right) \right] $$
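Theorem [1] can be checked exactly in a small discrete case: under independence the joint pmf is the product of the marginals, so the double sum factors. The pmfs and the functions $u_{1}, u_{2}$ below are arbitrary illustrative choices.

```python
from fractions import Fraction

# Two independent discrete random variables, given as {value: probability}.
X = {1: Fraction(1, 2), 2: Fraction(1, 2)}
Y = {0: Fraction(1, 3), 3: Fraction(2, 3)}

u1 = lambda x: x * x    # any function of the first variable
u2 = lambda y: y + 1    # any function of the second

def E(pmf, g):
    """Expectation of g under a discrete pmf, computed exactly."""
    return sum(p * g(x) for x, p in pmf.items())

# Left side: E[u1(X) u2(Y)] over the product-form joint pmf.
lhs = sum(px * py * u1(x) * u2(y)
          for x, px in X.items() for y, py in Y.items())
# Right side: the product of the individual expectations.
rhs = E(X, u1) * E(Y, u2)
print(lhs == rhs)  # True
```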
- [2] Moment Generating Function: If $X_{1} , \cdots , X_{n}$ are mutually independent and each $X_{i}$ has a moment-generating function $M_{i}(t)$ for $-h_{i} < t < h_{i}$, the moment-generating function of their linear combination $T := \sum_{i=1}^{n} a_{i} X_{i}$ is $$ M_{T}(t) = \prod_{i=1}^{n} M_{i} \left( a_{i} t \right) $$ for all $t$ with $\left| a_{i} t \right| < h_{i}$ for every $i$.
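Theorem [2] can likewise be sketched numerically: for two independent discrete variables, the MGF of $T = a_{1} X_{1} + a_{2} X_{2}$ computed directly from the joint pmf should match the product $M_{1}(a_{1} t) M_{2}(a_{2} t)$. The pmfs, coefficients, and the evaluation point $t$ are arbitrary choices.

```python
from math import exp, isclose

# Two independent discrete random variables, given as {value: probability}.
X1 = {0: 0.25, 1: 0.75}
X2 = {1: 0.5, 2: 0.5}
a1, a2 = 2.0, -1.0  # coefficients of the linear combination T = a1*X1 + a2*X2

def mgf(pmf, t):
    """M(t) = E[exp(t X)] for a discrete pmf."""
    return sum(p * exp(t * x) for x, p in pmf.items())

def mgf_combo(t):
    """MGF of T computed directly from the product-form joint pmf."""
    return sum(p1 * p2 * exp(t * (a1 * x1 + a2 * x2))
               for x1, p1 in X1.items() for x2, p2 in X2.items())

t = 0.3
print(isclose(mgf_combo(t), mgf(X1, a1 * t) * mgf(X2, a2 * t)))  # True
```

Both discrete variables are bounded, so their MGFs exist for all $t$ and the comparison is valid at any evaluation point.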
Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): pp. 122–125. ↩︎