
The Mean and Variance of the Bernoulli Distribution

Formula

Given $X \sim \operatorname{Bin}(1, p)$, the mean and variance of $X$ are as follows.

$$ E(X) = p $$

$$ \Var(X) = p(1-p) = pq, \qquad q = 1 - p $$
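As a quick numerical sanity check (not part of the proof), the following sketch simulates draws from $\operatorname{Bin}(1, p)$ with NumPy; the seed and the choice $p = 0.3$ are arbitrary assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)                 # arbitrary seed
p = 0.3                                         # arbitrary p for the check
x = rng.binomial(n=1, p=p, size=1_000_000)      # Bin(1, p), i.e. Bernoulli(p), samples

print(x.mean())  # close to p = 0.3
print(x.var())   # close to p*(1 - p) = 0.21
```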

Proof

For $p \in [0, 1]$, a discrete probability distribution with the following probability mass function is called a Bernoulli distribution.

$$ f(x) = p^{x}(1-p)^{1-x}, \qquad x = 0, 1 $$
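For instance, this mass function can be compared against SciPy's `bernoulli.pmf`; the sketch below is only illustrative, assuming SciPy is installed and taking $p = 0.3$ arbitrarily.

```python
from scipy.stats import bernoulli

p = 0.3  # arbitrary p
for x in (0, 1):
    # both expressions give the same probability mass at x
    print(x, p**x * (1 - p)**(1 - x), bernoulli.pmf(x, p))
```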

Direct Calculation

By the definition of expected value,

$$ \begin{align*} E(X) &= \sum\limits_{x = 0, 1} x f(x) \\ &= 0 \cdot f(0) + 1 \cdot f(1) \\ &= 0 \cdot (1-p) + 1 \cdot p \\ &= p \end{align*} $$

To obtain the variance, let’s calculate $E(X^{2})$.

$$ \begin{align*} E(X^{2}) &= \sum\limits_{x = 0, 1} x^{2} f(x) \\ &= 0^{2} \cdot f(0) + 1^{2} \cdot f(1) \\ &= 0^{2} \cdot (1-p) + 1^{2} \cdot p \\ &= p \end{align*} $$

The variance is $\Var(X) = E(X^{2}) - E(X)^{2}$, thus

$$ \Var(X) = p - p^{2} = p(1-p) = pq $$
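The same direct calculation can be re-checked symbolically; the sketch below, assuming SymPy, just re-evaluates the sums $\sum x f(x)$ and $\sum x^{2} f(x)$ from the definition of the pmf.

```python
import sympy as sp

p = sp.symbols("p")
f = lambda x: p**x * (1 - p)**(1 - x)     # Bernoulli pmf

EX  = sum(x * f(x) for x in (0, 1))       # E(X)   = p
EX2 = sum(x**2 * f(x) for x in (0, 1))    # E(X^2) = p

print(EX)                                 # p
print(sp.factor(EX2 - EX**2))             # equals p*(1 - p), i.e. pq
```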

From the Moment Generating Function

The moment generating function of the Bernoulli distribution follows directly from the definition $m(t) = E\left( e^{tX} \right) = e^{t \cdot 0} f(0) + e^{t \cdot 1} f(1)$:

$$ m(t) = 1 - p + pe^{t} = q + pe^{t} $$

The expected value is $m^{\prime}(0)$, therefore

$$ E(X) = m^{\prime}(0) = pe^{t}|_{t=0} = p $$

To find the variance, let’s calculate $m^{\prime\prime}(0)$.

$$ m^{\prime\prime}(0) = p e^{t}|_{t=0} = p $$

Since $\Var(X) = m^{\prime\prime}(0) - m^{\prime}(0)^{2}$, the variance is

$$ \Var(X) = p - p^{2} = p(1-p) = pq $$
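The moment generating function route can also be verified symbolically; this is a minimal sketch assuming SymPy, differentiating $m(t) = q + pe^{t}$ directly.

```python
import sympy as sp

t, p = sp.symbols("t p")
m = 1 - p + p * sp.exp(t)                    # Bernoulli MGF

mean = sp.diff(m, t).subs(t, 0)              # m'(0)  = p
second = sp.diff(m, t, 2).subs(t, 0)         # m''(0) = p

print(mean)                                  # p
print(sp.factor(second - mean**2))           # equals p*(1 - p), i.e. pq
```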