Mean and Variance of the Hypergeometric Distribution
Formula
Let the random variable $X$ follow the hypergeometric distribution $X \sim \operatorname{HG}(N, D, n)$, and write $p := D / N$. Then its mean and variance are given by: $$ \begin{align*} E (X) =& n \frac{D}{N} = n p \\ \Var (X) =& n {\frac{ D }{ N }} {\frac{ N - D }{ N }} {\frac{ N - n }{ N - 1 }} = np(1 - p) \frac{N - n}{N - 1} \end{align*} $$
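As a quick numeric sanity check (not part of the derivation), the formulas can be compared against `scipy.stats.hypergeom`, assuming SciPy is available; note that SciPy parametrizes the distribution as $(M, n, N) =$ (population size, number of success states, number of draws), which corresponds to our $(N, D, n)$. The parameter values below are arbitrary.

```python
# Sanity check of E(X) = np and Var(X) = np(1-p)(N-n)/(N-1).
# SciPy's hypergeom takes (M, n, N) = (population, successes, draws),
# i.e. our (N, D, n).
from scipy.stats import hypergeom

N, D, n = 50, 10, 5
p = D / N

mean_formula = n * p                               # 1.0
var_formula = n * p * (1 - p) * (N - n) / (N - 1)  # 36/49 ~ 0.7347

mean_scipy, var_scipy = hypergeom.stats(N, D, n, moments="mv")
print(mean_formula, float(mean_scipy))
print(var_formula, float(var_scipy))
```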
Derivation
Mean[^1]
Binomial coefficient subtraction identity: $$ \binom{m}{x} = \left( {\frac{ m }{ x }} \right) \binom{m-1}{x-1} $$
For the binomial coefficient, the following holds. $$ \begin{align*} x \binom{m}{x} =& m \binom{m-1}{x-1} \\ \binom{m}{x} =& {\frac{ m }{ x }} \binom{m-1}{x-1} \end{align*} $$
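As a concrete instance of the identity, take $m = 5$ and $x = 3$: then $x \binom{m}{x} = 3 \binom{5}{3} = 30$ and $m \binom{m-1}{x-1} = 5 \binom{4}{2} = 30$, so both sides agree.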
Definition of the hypergeometric distribution: For natural numbers $n, N, D \in \mathbb{N}$, the following probability mass function defines a discrete probability distribution called the hypergeometric distribution. $$ p(x) = {\frac{ \binom{D}{x} \binom{N - D}{n - x} }{ \binom{N}{n} }} \qquad , x \in \left\{ 0, 1, \cdots , n \right\} $$
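To see the definition in action, here is a minimal sketch (with arbitrary parameters $N = 50$, $D = 10$, $n = 5$) that computes this pmf directly from binomial coefficients and checks that it sums to $1$ over the support:

```python
# Hypergeometric pmf built directly from the definition above.
# math.comb(a, b) returns 0 when b > a, so out-of-range terms vanish.
from math import comb

def hg_pmf(x, N, D, n):
    return comb(D, x) * comb(N - D, n - x) / comb(N, n)

N, D, n = 50, 10, 5
total = sum(hg_pmf(x, N, D, n) for x in range(n + 1))
print(total)  # 1.0 up to floating-point error
```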
In the expansion below we use the variable substitution $y := x - 1$; the fourth equality applies the identity above twice, as $x \binom{D}{x} = D \binom{D-1}{x-1}$ in the numerator and $\binom{N}{n} = {\frac{ N }{ n }} \binom{N-1}{n-1}$ in the denominator. $$ \begin{align*} E (X) =& \sum_{x=0}^{n} x p(x) \\ =& \sum_{x=0}^{n} x {\frac{ \binom{D}{x} \binom{N - D}{n - x} }{ \binom{N}{n} }} \\ =& 0 + \sum_{x=1}^{n} x {\frac{ \binom{D}{x} \binom{N - D}{n - x} }{ \binom{N}{n} }} \\ =& {\frac{ D n }{ N }} \sum_{x=1}^{n} { \frac{ \binom{D-1}{x-1} \binom{N - D}{n - x} }{ \binom{N-1}{n-1} }} \\ =& n {\frac{ D }{ N }} \sum_{y=0}^{n-1} { \frac{ \binom{D-1}{y} \binom{(N-1) - (D-1)}{(n - 1 )- y} }{ \binom{N-1}{n-1} }} \\ =& n {\frac{ D }{ N }} \cdot 1 \end{align*} $$ The last factor is the sum of the probability mass function of $Y \sim \operatorname{HG}(N-1, D-1, n-1)$ over its entire support, hence it equals $1$.
■
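A minimal numeric check of the two key steps above, under the same arbitrary parameters: the direct sum $\sum_{x} x \, p(x)$ equals $n D / N$, and the reindexed sum is the pmf of $\operatorname{HG}(N-1, D-1, n-1)$ summing to $1$.

```python
# Check E(X) = nD/N and that the reindexed sum over y equals 1.
from math import comb

def hg_pmf(x, N, D, n):
    return comb(D, x) * comb(N - D, n - x) / comb(N, n)

N, D, n = 50, 10, 5
ex = sum(x * hg_pmf(x, N, D, n) for x in range(n + 1))
shifted = sum(hg_pmf(y, N - 1, D - 1, n - 1) for y in range(n))
print(ex, n * D / N)  # both 1.0
print(shifted)        # 1.0
```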
Variance[^2]
There is an algebraically neater proof[^3], but I personally find the derivation that uses intuition about the hypergeometric distribution more enjoyable, so I present the following method.
$$ X = X_{1} + \cdots + X_{n} $$ The random variable $X$ can be written as the sum of indicator random variables $X_{k}$ as above, where $X_{k}$ is $1$ if the $k$-th draw is one of the $D$ success items and $0$ otherwise. By symmetry, each draw is equally likely to land on any of the $N$ items, so $E \left( X_{k} \right) = D / N = p$ regardless of $k$. In the hypergeometric setting the $n$ items are drawn without replacement, so the $X_{k}$ are not independent; to compute the variance we also need the covariances, as follows. $$ \begin{align*} \Var (X) =& \Var \left( X_{1} + \cdots + X_{n} \right) \\ =& \sum_{k=1}^{n} \Var \left( X_{k} \right) + \sum_{i \ne j} \Cov \left( X_{i}, X_{j} \right) \end{align*} $$
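The claim that $E(X_k) = D/N$ holds for every draw position $k$, not just the first, can be checked by simulation; the following sketch (again with arbitrary parameters) estimates the success probability at each position under sampling without replacement.

```python
# Estimate P(X_k = 1) for each position k when drawing n of N items
# without replacement, D of which are successes.
import random

N, D, n, trials = 50, 10, 5, 200_000
population = [1] * D + [0] * (N - D)

position_sums = [0] * n
for _ in range(trials):
    sample = random.sample(population, n)  # sampling without replacement
    for k in range(n):
        position_sums[k] += sample[k]

# Every entry should be close to D/N = 0.2, regardless of position.
print([round(s / trials, 3) for s in position_sums])
```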
Each $X_{k}$ is either $0$ or $1$, therefore $X_{k}^{2} = X_{k}$, and the variance of $X_{k}$ is: $$ \begin{align*} \Var \left( X_{k} \right) =& E \left( X_{k}^{2} \right) - E \left( X_{k} \right)^{2} \\ =& E \left( X_{k} \right) - E \left( X_{k} \right)^{2} \\ =& p - p^{2} \\ =& {\frac{ D }{ N }} - \left( {\frac{ D }{ N }} \right)^{2} \\ =& {\frac{ ND - D^{2} }{ N^{2} }} \\ =& {\frac{ D \left( N - D \right) }{ N^{2} }} \end{align*} $$
Since the covariance is $\Cov \left( X_{i}, X_{j} \right) = E \left( X_{i} X_{j} \right) - E \left( X_{i} \right) E \left( X_{j} \right)$, we need the joint distribution of $X_{i}$ and $X_{j}$. Considering the cases where $X_{i}$ and $X_{j}$ are $0$ or $1$ yields the following. $$ X_{i} X_{j} = \begin{cases} 1 & \text{if } X_{i} = 1 \land X_{j} = 1 \\ 0 & \text{otherwise} \end{cases} $$ The probability of $X_{i} X_{j} = 1$ is the ratio of the $D \left( D - 1 \right)$ favorable ordered pairs to the $N \left( N - 1 \right)$ ordered pairs of two draws in total, which gives the expectation of $X_{i} X_{j}$. $$ \begin{align*} & P \left( X_{i} X_{j} = 1 \right) = P \left( X_{i} = X_{j} = 1 \right) = {\frac{ D \left( D - 1 \right) }{ N \left( N - 1 \right) }} \\ \implies & E \left( X_{i} X_{j} \right) = 1 \cdot P \left( X_{i} X_{j} = 1 \right) + 0 \cdot P \left( X_{i} X_{j} = 0 \right) = {\frac{ D \left( D - 1 \right) }{ N \left( N - 1 \right) }} \end{align*} $$ Accordingly, the covariance of $X_{i}$ and $X_{j}$ is: $$ \begin{align*} & \Cov \left( X_{i} , X_{j} \right) \\ =& E \left( X_{i} X_{j} \right) - E \left( X_{i} \right) E \left( X_{j} \right) \\ =& {\frac{ D \left( D - 1 \right) }{ N \left( N - 1 \right) }} - \left( {\frac{ D }{ N }} \right)^{2} \\ =& {\frac{ N \left( D^{2} - D \right) - D^{2} \left( N - 1 \right) }{ N^{2} \left( N - 1 \right) }} \\ =& {\frac{ D^{2} - N D }{ N^{2} \left( N - 1 \right) }} \\ =& - {\frac{ D \left( N - D \right) }{ N^{2} \left( N - 1 \right) }} \end{align*} $$ Finally, since all $n$ variances are equal and all $n(n-1)$ pairwise covariances are equal, computing $\Var \left( X \right)$ yields: $$ \begin{align*} & \Var \left( X \right) \\ =& \sum_{k=1}^{n} \Var \left( X_{k} \right) + \sum_{i \ne j} \Cov \left( X_{i}, X_{j} \right) \\ =& \sum_{k=1}^{n} \Var \left( X_{k} \right) + 2 \sum_{i < j} \Cov \left( X_{i}, X_{j} \right) \\ =& n \Var \left( X_{k} \right) + 2 \binom{n}{2} \Cov \left( X_{i}, X_{j} \right) \\ =& n {\frac{ D \left( N - D \right) }{ N^{2} }} - 2 {\frac{ n (n-1) }{ 2 }} {\frac{ D \left( N - D \right) }{ N^{2} \left( N - 1 \right) }} \\ =& {\frac{ n D \left( N - D \right) }{ N^{2} }} \left( 1 - {\frac{ n - 1 }{ N - 1 }} \right) \\ =& {\frac{ n D \left( N - D \right) }{ N^{2} }} \cdot {\frac{ N - n }{ N - 1 }} \\ =& n p \left( 1 - p \right) {\frac{ N - n }{ N - 1 }} \end{align*} $$
■
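As a final check, a Monte Carlo sketch (same arbitrary parameters as before) estimates both the pairwise covariance and the overall variance and compares them with the closed forms derived above.

```python
# Estimate Cov(X_1, X_2) and Var(X) by simulation and compare with
# -D(N-D)/(N^2 (N-1)) and np(1-p)(N-n)/(N-1).
import random

N, D, n, trials = 50, 10, 5, 200_000
p = D / N
population = [1] * D + [0] * (N - D)

xs, x1x2 = [], 0
for _ in range(trials):
    s = random.sample(population, n)
    xs.append(sum(s))
    x1x2 += s[0] * s[1]

mean_x = sum(xs) / trials
var_x = sum((x - mean_x) ** 2 for x in xs) / trials
cov_12 = x1x2 / trials - p * p  # E(X_i X_j) - E(X_i)E(X_j)

print(var_x, n * p * (1 - p) * (N - n) / (N - 1))  # ~ 0.7347
print(cov_12, -D * (N - D) / (N**2 * (N - 1)))     # ~ -0.00327
```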
[^1]: Casella. (2001). *Statistical Inference* (2nd Edition): p. 87.
[^2]: https://mathweb.ucsd.edu/~gptesler/186/slides/186_hypergeom_17-handout.pdf
[^3]: heropup, Derivation of mean and variance of Hypergeometric Distribution, URL (version: 2016-02-23): https://math.stackexchange.com/q/1669384