
Minimum Variance Unbiased Estimator Uniqueness

Theorem 1

If $W$ is the Best Unbiased Estimator for $\tau(\theta)$, then $W$ is unique.

Proof

Cauchy-Schwarz Inequality: For random variables $X, Y$, the following holds:
$$ \operatorname{Cov} (X, Y)^{2} \le \operatorname{Var} X \operatorname{Var} Y $$
The necessary and sufficient condition for equality to hold is:
$$ \exists a \ne 0 , b \in \mathbb{R} : a X + b = Y $$
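As a quick numerical sanity check (my own sketch, not part of the source proof), the following Python snippet verifies the covariance form of the Cauchy-Schwarz inequality on simulated data; the coefficients $0.7$ and $0.5$ are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two dependent random variables; 0.7 and 0.5 are arbitrary.
x = rng.normal(size=100_000)
y = 0.7 * x + rng.normal(scale=0.5, size=100_000)

cov_xy = np.cov(x, y)[0, 1]
var_x = x.var(ddof=1)
var_y = y.var(ddof=1)

# Cauchy-Schwarz: Cov(X, Y)^2 <= Var(X) * Var(Y)
print(cov_xy**2 <= var_x * var_y)    # True
print(cov_xy**2 / (var_x * var_y))   # < 1, since Y is not exactly affine in X
```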


Suppose $W'$ is another Best Unbiased Estimator for $\tau(\theta)$, and consider $W^{\ast} := \left( W + W' \right) / 2$. Its expectation is
$$ E_{\theta} W^{\ast} = \left( \tau (\theta) + \tau (\theta) \right) / 2 = \tau (\theta) $$
so $W^{\ast}$ is also unbiased. Since $W$ and $W'$ are both best unbiased, $\operatorname{Var}_{\theta} W' = \operatorname{Var}_{\theta} W$, and the variance of $W^{\ast}$ satisfies
$$ \begin{align*} \operatorname{Var}_{\theta} W^{\ast} &= \operatorname{Var}_{\theta} \left( {{ 1 } \over { 2 }} W + {{ 1 } \over { 2 }} W' \right) \\ &= {{ 1 } \over { 4 }} \operatorname{Var}_{\theta} W + {{ 1 } \over { 4 }} \operatorname{Var}_{\theta} W' + {{ 1 } \over { 2 }} \operatorname{Cov}_{\theta} \left( W, W' \right) \\ &\le {{ 1 } \over { 4 }} \operatorname{Var}_{\theta} W + {{ 1 } \over { 4 }} \operatorname{Var}_{\theta} W' + {{ 1 } \over { 2 }} \sqrt{\operatorname{Var}_{\theta} W \cdot \operatorname{Var}_{\theta} W'} \\ &= \operatorname{Var}_{\theta} W \end{align*} $$
where the inequality follows from the Cauchy-Schwarz Inequality.
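To see why the strict inequality would be fatal, here is a hedged illustration (my own example, not from the text): split a normal sample into two halves and use each half-mean as an unbiased estimator of the mean. The two estimators have equal variance but zero covariance, so averaging them strictly reduces variance; exactly this cannot happen when $W$ is already best unbiased.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two unbiased estimators of the normal mean with equal variance:
# W = mean of the first half of the sample, W' = mean of the second half.
n_rep, n = 50_000, 40
samples = rng.normal(loc=0.0, scale=1.0, size=(n_rep, n))

w = samples[:, : n // 2].mean(axis=1)        # W
w_prime = samples[:, n // 2 :].mean(axis=1)  # W'
w_star = (w + w_prime) / 2                   # W* = (W + W') / 2

# Var(W*) ≈ Var(W) / 2: strictly smaller, so neither W nor W' was best.
print(w.var(ddof=1), w_prime.var(ddof=1), w_star.var(ddof=1))
```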

If the strict inequality $<$ held, then $\operatorname{Var}_{\theta} W^{\ast} < \operatorname{Var}_{\theta} W$, which would contradict the premise that $W$ is the Best Unbiased Estimator; hence the equality $=$ must hold for all $\theta$. By the equality condition of the Cauchy-Schwarz Inequality, there exist some $a(\theta) \ne 0$ and $b(\theta) \in \mathbb{R}$ such that
$$ a(\theta) W + b(\theta) = W' $$
Direct calculation according to the Properties of Covariance then gives
$$ \begin{align*} \operatorname{Cov}_{\theta} \left( W, W' \right) &= \operatorname{Cov}_{\theta} \left( W, a(\theta) W + b(\theta) \right) \\ &= \operatorname{Cov}_{\theta} \left( W, a(\theta) W \right) \\ &= a(\theta) \operatorname{Var}_{\theta} W \end{align*} $$
On the other hand, equality in the variance bound above means $\operatorname{Cov}_{\theta} \left( W, W' \right) = \sqrt{\operatorname{Var}_{\theta} W \cdot \operatorname{Var}_{\theta} W'} = \operatorname{Var}_{\theta} W$, so $a(\theta) = 1$. Finally, since both estimators are unbiased, $E_{\theta} W' = \tau(\theta) = E_{\theta} W$, and taking expectations of $W' = W + b(\theta)$ yields $b(\theta) = 0$, which proves $W = W'$.
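The covariance step above, $\operatorname{Cov}(W, aW + b) = a \operatorname{Var} W$, can also be confirmed numerically; here is a throwaway sketch with arbitrary values of $a$ and $b$ (hypothetical, chosen only for this check):

```python
import numpy as np

rng = np.random.default_rng(2)

# W is any square-integrable random variable; a, b are arbitrary
# affine coefficients for the check.
w = rng.normal(loc=1.0, scale=2.0, size=200_000)
a, b = 1.7, -0.3
w_prime = a * w + b

# Cov(W, aW + b) should agree with a * Var(W) up to sampling noise.
print(np.cov(w, w_prime)[0, 1], a * w.var(ddof=1))
```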


  1. Casella. (2001). Statistical Inference (2nd Edition): p343. ↩︎