Negative Binomial Distribution
Definition 1
Given $r \in \mathbb{N}$ and $p \in (0,1]$, a discrete probability distribution $\text{NB}(r,p)$ with the following probability mass function is called the Negative Binomial Distribution. $$ p(x) = \binom{r+x-1}{x} p^{r}(1-p)^{x} \qquad, x = 0,1,2,\cdots $$ Here $x$ counts the number of failures observed before the $r$-th success.
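As a quick sanity check of the pmf above, the probabilities should sum to $1$. A minimal sketch in base Julia, truncating the infinite sum and using illustrative values $r = 2$, $p = 0.5$:

```julia
# Sanity check: the NB(r, p) pmf sums to 1 (series truncated at x = 500).
# r and p below are illustrative choices.
r, p = 2, 0.5
total = sum(binomial(r + x - 1, x) * p^r * (1 - p)^x for x in 0:500)
println(total)  # ≈ 1.0
```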
Basic Properties
Moment Generating Function
- [1]: $$m(t) = \left[ {{ p } \over { 1 - (1-p) e^{t} }} \right]^{r} \qquad , t < -\log (1-p)$$
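The closed-form MGF can be spot-checked against a truncated version of the defining series $\sum_{x} e^{tx} p(x)$. A small sketch in base Julia, with illustrative values $r = 3$, $p = 0.6$, $t = 0.2$ (note $t < -\log(1-p) \approx 0.916$):

```julia
# Compare the closed-form MGF with a truncated series sum of e^{tx} p(x).
# r, p, t are illustrative; t must satisfy t < -log(1 - p).
r, p, t = 3, 0.6, 0.2
series = sum(exp(t * x) * binomial(r + x - 1, x) * p^r * (1 - p)^x for x in 0:200)
closed = (p / (1 - (1 - p) * exp(t)))^r
println(series ≈ closed)  # true
```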
Mean and Variance
- [2]: If $X \sim \text{NB}(r, p)$, then $$ \begin{align*} E(X) =& {{ r (1-p) } \over { p }} \\ \operatorname{Var}(X) =& {{ r (1-p) } \over { p^{2} }}\end{align*} $$
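The mean and variance formulas can be verified by simulation. Below is a minimal sketch in base Julia (no packages) that samples $\text{NB}(r,p)$ directly by counting failures before the $r$-th success; `rand_nb` and the values $r = 5$, $p = 0.4$, $N = 10^6$ are illustrative choices:

```julia
# Monte Carlo check of E(X) = r(1-p)/p and Var(X) = r(1-p)/p^2.
using Random, Statistics

# Draw one NB(r, p) sample: count failures before the r-th success.
function rand_nb(r, p)
    successes = failures = 0
    while successes < r
        rand() < p ? (successes += 1) : (failures += 1)
    end
    return failures
end

Random.seed!(0)
r, p, N = 5, 0.4, 10^6
samples = [rand_nb(r, p) for _ in 1:N]

println(mean(samples))  # ≈ r(1-p)/p  = 7.5
println(var(samples))   # ≈ r(1-p)/p² = 18.75
```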
Description
The negative binomial distribution is concerned with how long one must wait for an event with probability $p$ to occur $r$ times; in the parameterization above, $X$ counts the number of failures before the $r$-th success. For example, consider how many times one must flip a coin until it lands on heads twice. Since the probability of heads is $50\%$, it takes about two flips to get heads once, and needing that to happen one more time gives an expected total of $4$ flips, of which $E(X) = r(1-p)/p = 2$ are expected to be tails.
Intuitively, the negative binomial distribution can be seen as the geometric distribution generalized to an arbitrary number of successes $r$. In fact, when the number of successes is one, i.e., $r = 1$, it is exactly the geometric distribution.
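The $r = 1$ reduction is easy to confirm numerically: the $\text{NB}(1,p)$ pmf collapses to the geometric pmf $p(1-p)^{x}$. A minimal base-Julia check with an illustrative $p = 0.3$:

```julia
# With r = 1, binomial(x, x) = 1, so the NB pmf reduces to p(1-p)^x.
# p = 0.3 is an illustrative choice.
p = 0.3
for x in 0:5
    @assert binomial(1 + x - 1, x) * p^1 * (1 - p)^x ≈ p * (1 - p)^x
end
println("NB(1, p) matches Geo(p) for x = 0:5")
```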
Naming
The reason for calling it a negative binomial distribution is because the shape of its probability mass function is related to the negative binomial coefficient.
Theorem
Generalization of Geometric Distribution
- [b]: If $Y = X_{1} + \cdots + X_{r}$ and $X_{i} \overset{\text{iid}}{\sim} \text{Geo}(p)$, then $Y \sim \text{NB}(r,p)$
Proof
[1]
Negative Binomial Coefficient: $$ (-1)^{k} \binom{-r}{k} = \binom{r + k - 1}{ k } $$
$$ \begin{align*} m(t) =& \sum_{x=0}^{\infty} e^{tx} p(x) \\ =& \sum_{x=0}^{\infty} e^{tx} \binom{r+x-1}{x} p^{r} (1-p)^{x} \\ =& p^{r}\sum_{x=0}^{\infty} \binom{-r}{x} (-1)^{x} \left[ (1-p) e^{t} \right]^{x} \\ =& p^{r}\sum_{x=0}^{\infty} \binom{-r}{x} \left[ - (1-p) e^{t} \right]^{x} \end{align*} $$
Binomial Series: If $|x| < 1$, then for $\alpha \in \mathbb{C}$, $\displaystyle (1 + x )^{\alpha} = \sum_{k=0}^{\infty} \binom{\alpha}{k} x^{k}$
Based on the binomial series, since $\left| (1-p) e^{t} \right| < 1$ holds precisely when $t < -\log (1-p)$, we have $\displaystyle \sum_{x=0}^{\infty} \binom{-r}{x} \left[ - (1-p) e^{t} \right]^{x} = \left[ 1 - (1-p) e^{t} \right]^{-r}$, and therefore $$ m(t) = \left[ {{ p } \over { 1 - (1-p) e^{t} }} \right]^{r} \qquad , t < -\log (1-p) $$
■
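The negative binomial coefficient identity used in the proof above can be spot-checked numerically: Julia's `binomial` already extends to negative upper arguments. The ranges `r in 1:6`, `k in 0:10` are illustrative:

```julia
# Numerical spot-check of (-1)^k * binom(-r, k) == binom(r + k - 1, k).
for r in 1:6, k in 0:10
    lhs = (-1)^k * binomial(-r, k)
    rhs = binomial(r + k - 1, k)
    @assert lhs == rhs
end
println("identity holds for r = 1:6, k = 0:10")
```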
[2]
By [b], a variable $X \sim \text{NB}(r,p)$ has the same distribution as $X_{1} + \cdots + X_{r}$ with $X_{i} \overset{\text{iid}}{\sim} \text{Geo}(p)$. Each geometric variable has mean $(1-p)/p$ and variance $(1-p)/p^{2}$, so linearity of expectation and independence of the $X_{i}$ give $$ E(X) = r \cdot {{ 1-p } \over { p }} = {{ r (1-p) } \over { p }} \qquad , \qquad \operatorname{Var}(X) = r \cdot {{ 1-p } \over { p^{2} }} = {{ r (1-p) } \over { p^{2} }} $$
■
[b]
When the probability mass function of the geometric distribution is defined as $p(x) = p (1-p)^{x} \qquad,x=0,1,2,\cdots$, its moment generating function is as follows: $$ m(t) = p \left( 1 - (1-p) e^{t} \right)^{-1} $$ Since the mutually independent random variables $X_1, X_2, \cdots , X_r$ each follow $\text{Geo} (p)$, the moment generating function of $Y$ is $$ \begin{align*} M_Y(t) =& E(e^{Yt}) \\ =& E(e^{(X_1+X_2+\cdots+X_r)t}) \\ =& E(e^{X_1 t}) E(e^{X_2 t}) \cdots E(e^{X_r t}) \\ =& \prod_{i=1}^r p \left( 1 - (1-p) e^t \right)^{-1} \\ =& p^r \left( 1 - (1-p) e^t \right)^{-r} \end{align*} $$ This is identical to the moment generating function of the negative binomial distribution $\text{NB}(r,p)$, therefore $Y \sim \text{NB}(r,p)$.
■
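Theorem [b] can also be checked empirically: summing $r$ iid geometric failure counts should reproduce the $\text{NB}(r,p)$ pmf. A sketch in base Julia, where `geo`, $r = 4$, $p = 0.3$, and $N = 10^6$ are illustrative choices:

```julia
# Empirical check of [b]: sum r iid Geo(p) failure counts and compare
# observed frequencies with the NB(r, p) pmf.
using Random

function geo(p)   # failures before the first success
    k = 0
    while rand() >= p
        k += 1
    end
    return k
end

Random.seed!(1)
r, p, N = 4, 0.3, 10^6
sums = [sum(geo(p) for _ in 1:r) for _ in 1:N]

nb_pmf(x) = binomial(r + x - 1, x) * p^r * (1 - p)^x
for x in 0:3
    # empirical frequencies should track the exact pmf values
    println(x, ": empirical ", count(==(x), sums) / N,
            "  exact ", round(nb_pmf(x), digits = 4))
end
```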
Code
Below is Julia code that renders the probability mass function of the negative binomial distribution as a GIF.
```julia
@time using LaTeXStrings
@time using Distributions
@time using Plots

cd(@__DIR__)

x = 0:20
P = collect(0.2:0.01:0.8); append!(P, reverse(P))

animation = @animate for p ∈ P
    scatter(x, pdf.(NegativeBinomial(5, p), x),
        color = :black, markerstrokecolor = :black,
        label = "r = 5, p = $(rpad(p, 4, '0'))", size = (400, 300))
    xlims!(0, 20); ylims!(0, 0.5); title!(L"\mathrm{pmf\,of\,NB}(5, p)")
end
gif(animation, "pmf5.gif")

animation = @animate for p ∈ P
    scatter(x, pdf.(NegativeBinomial(10, p), x),
        color = :black, markerstrokecolor = :black,
        label = "r = 10, p = $(rpad(p, 4, '0'))", size = (400, 300))
    xlims!(0, 20); ylims!(0, 0.5); title!(L"\mathrm{pmf\,of\,NB}(10, p)")
end
gif(animation, "pmf10.gif")
```
Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p. 145.