Unbiased Estimators and the Cramér-Rao Bound 📂Mathematical Statistics

Theorem

Regularity Conditions:

  • (R0): The probability density function $f$ is identifiable in $\theta$; that is, distinct parameters yield distinct densities. $$ \theta \ne \theta ' \implies f \left( x_{k} ; \theta \right) \ne f \left( x_{k} ; \theta ' \right) $$
  • (R1): The probability density function $f$ has the same support for all $\theta$.
  • (R2): The true value $\theta_{0}$ is an interior point of $\Omega$.
  • (R3): The probability density function $f$ is twice differentiable with respect to $\theta$.
  • (R4): The integral $\int f (x; \theta) dx$ is twice differentiable with respect to $\theta$, and differentiation may be interchanged with integration.

Let’s define the likelihood function $L (\theta | \mathbf{X} ) := \prod_{k=1}^{n} f \left( x_{k} | \theta \right)$ for a random sample $X_{1} , \cdots , X_{n}$ drawn from $f (x|\theta)$ that satisfies the regularity conditions (R0)-(R4). If $W \left( \mathbf{X} \right) = W \left( X_{1} , \cdots , X_{n} \right)$ is an unbiased estimator of $\tau \left( \theta \right)$, then $W \left( \mathbf{X} \right)$ attains the Cramér-Rao lower bound $\text{RC}$ if and only if the following holds for some function $a(\theta)$. $$ a \left( \theta \right) \left[ W \left( \mathbf{X} \right) - \tau (\theta) \right] = {{ \partial \log L (\theta | \mathbf{X}) } \over { \partial \theta }} $$

Explanation

In summary: whenever $W \left( \mathbf{X} \right) - \tau (\theta)$ is proportional to the score ${{ \partial \log L (\theta | \mathbf{X}) } \over { \partial \theta }}$, it follows that $\operatorname{Var} W \left( \mathbf{X} \right) = \text{RC}$. The proof of the theorem1 itself is not very difficult, but it is omitted here because it leans heavily on machinery specific to that particular textbook.
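As a classic illustration (this example is not from the source text), consider a random sample from $N \left( \theta , \sigma^{2} \right)$ with $\sigma^{2}$ known and $\tau (\theta) = \theta$. Then

$$ {{ \partial \log L (\theta | \mathbf{X}) } \over { \partial \theta }} = \sum_{k=1}^{n} {{ x_{k} - \theta } \over { \sigma^{2} }} = {{ n } \over { \sigma^{2} }} \left[ \bar{x} - \theta \right] $$

which is exactly of the form $a \left( \theta \right) \left[ W \left( \mathbf{X} \right) - \tau (\theta) \right]$ with $a (\theta) = n / \sigma^{2}$ and $W \left( \mathbf{X} \right) = \bar{X}$, so the sample mean attains the bound $\text{RC} = \sigma^{2} / n$.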

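The attainment statement can also be checked numerically. The following sketch (an assumed setup using NumPy, not code from the source) draws samples from $N(\theta, \sigma^{2})$ with $\sigma$ known, where the unbiased estimator $W = \bar{X}$ is known to attain the bound $\sigma^{2}/n$, and compares the empirical variance of $\bar{X}$ against it:

```python
import numpy as np

# Sketch of a numerical check (assumed setup, not from the source text):
# for X_i ~ N(theta, sigma^2) with sigma known, the Fisher information per
# observation is 1/sigma^2, so the Cramér-Rao lower bound for an unbiased
# estimator of theta is sigma^2 / n. The sample mean attains it.

rng = np.random.default_rng(0)
theta, sigma, n = 2.0, 3.0, 50
reps = 100_000  # number of simulated samples

samples = rng.normal(theta, sigma, size=(reps, n))
sample_means = samples.mean(axis=1)

crlb = sigma**2 / n                 # Cramér-Rao lower bound sigma^2 / n
empirical_var = sample_means.var()  # variance of the unbiased estimator X-bar

print(f"CRLB = {crlb:.4f}, empirical Var = {empirical_var:.4f}")
```

With these values the bound is $3^{2}/50 = 0.18$, and the simulated variance of the sample mean lands very close to it.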

  1. Casella. (2001). Statistical Inference(2nd Edition): p336~341. ↩︎