
Unbiased Estimator

Definition 1

If an estimator $T$ of $\theta$ satisfies the following, then $T$ is called an unbiased estimator of $\theta$: $$ E T = \theta $$

Explanation

In particular, among the unbiased estimators of $\theta$, the one with the smallest variance is called the minimum variance unbiased estimator.

Unbiasedness refers to the property of having no bias, that is, of being correct on average. For example, if we assume $X_{i} \sim \left( \mu , \sigma^{2} \right)$ and use the sample mean $\displaystyle \overline{X} = {{ 1 } \over { n }} \sum_{i} X_{i}$ as an estimator of $\mu$, then since $\displaystyle E \overline{X} = \mu$, the sample mean $\overline{X}$ is an unbiased estimator of $\mu$. This might seem obvious at first, but the fact that an estimator hits the parameter on average is a very important property and not at all a given. For instance, consider the variance and the sample variance.
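To see this unbiasedness numerically, here is a minimal Monte Carlo sketch (my own illustration, not from the text above), assuming a normal population with $\mu = 3$ and $\sigma = 2$: it draws many samples of size $n$ and checks that the sample means average out to roughly $\mu$.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 3.0, 2.0, 10, 100_000

# Draw `reps` independent samples of size n and compute each sample mean.
samples = rng.normal(mu, sigma, size=(reps, n))
sample_means = samples.mean(axis=1)

# The average of the sample means should be close to mu, reflecting E[X̄] = mu.
print(sample_means.mean())  # ≈ 3.0
```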

Example

If we assume $$ X_{i} \sim \left( \mu , \sigma^{2} \right) $$, an unbiased estimator for the variance is as follows: $$ S^{2} := {{1} \over {n-1}} \sum_{i=1}^{n} \left( X_{i} - \overline{X} \right)^{2} $$ As is well known, unlike the sample mean, the sample variance sums all squared deviations and divides by $n-1$, not $n$. The reason we divide by $n-1$ when calculating the sample variance can be explained in various ways depending on the reader's level of understanding, but the most precise, formula-level explanation is 'so that the sample variance is an unbiased estimator'.
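The difference between dividing by $n$ and dividing by $n-1$ can also be seen by simulation. The following sketch (again my own illustration, assuming a normal population with $\sigma^{2} = 4$ and $n = 5$) compares the two versions over many replications; only the $n-1$ version averages out to roughly $\sigma^{2}$.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 3.0, 2.0, 5, 200_000

samples = rng.normal(mu, sigma, size=(reps, n))

# Biased version: divide the sum of squared deviations by n.
var_n = samples.var(axis=1, ddof=0)
# Unbiased version S^2: divide by n - 1.
var_n_minus_1 = samples.var(axis=1, ddof=1)

print(var_n.mean())          # ≈ (n-1)/n * sigma^2 = 3.2
print(var_n_minus_1.mean())  # ≈ sigma^2 = 4.0
```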

Proof 2

Let $$ \mu := E X_{i} = E \overline{X} \\ \sigma^{2} := E X_{i} ^{2} - \mu^{2} $$ Then, using the independence of the $X_{i}$, $$ \begin{align*} E \left( \overline{X}^{2} \right) - \mu^{2} =& E \left( \overline{X}^{2} \right) - \left( E \overline{X} \right)^{2} \\ =& \operatorname{Var} \overline{X} \\ =& \operatorname{Var} \left( {{1} \over {n}} \sum_{i=1}^{n} X_{i} \right) \\ =& {{1} \over {n^{2}}} \sum_{i=1}^{n} \operatorname{Var} X_{i} \\ =& {{1} \over {n^{2}}} n \sigma^{2} \\ =& {{\sigma^{2}} \over {n}} \end{align*} $$ so that $E \left( \overline{X}^{2} \right) = \mu^{2} + \sigma^{2}/n$. Since $\sum_{i=1}^{n} \left( X_{i} - \overline{X} \right)^{2} = \sum_{i=1}^{n} X_{i}^{2} - n \overline{X}^{2}$, the expected value of the sample variance $S^{2}$ is $$ \begin{align*} E S^{2} =& (n-1)^{-1} E \sum_{i=1}^{n} \left( X_{i} - \overline{X} \right)^{2} \\ =& (n-1)^{-1} \left[ \sum_{i=1}^{n} E X_{i}^{2} - n E \overline{X} ^{2} \right] \\ =& (n-1)^{-1} \left[ \sum_{i=1}^{n} \left( \sigma^{2} + \mu^{2} \right) - n \left( \mu^{2} + {{\sigma^{2}} \over {n}} \right) \right] \\ =& (n-1)^{-1} \left[ n\sigma^{2} + n \mu^{2} - n \mu^{2} - \sigma^{2} \right] \\ =& (n-1)^{-1} (n-1) \sigma^{2} \\ =& \sigma^{2} \end{align*} $$
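As a quick numerical sanity check of the intermediate identity $E \left( \overline{X}^{2} \right) = \mu^{2} + \sigma^{2}/n$ used in the proof, the following sketch (my own illustration, assuming a normal population) estimates $E \left( \overline{X}^{2} \right)$ by Monte Carlo and compares it with the closed form.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, reps = 3.0, 2.0, 8, 200_000

samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1)

# Monte Carlo estimate of E[X̄²] versus the closed form mu² + sigma²/n.
print((xbar**2).mean())      # estimate, ≈ 9.5
print(mu**2 + sigma**2 / n)  # 9 + 4/8 = 9.5
```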


  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p208. ↩︎

  2. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p137. ↩︎