The Variance of an Unbiased Estimator Given a Sufficient Statistic is Minimized
Theorem
Let θ be a parameter, let U be an unbiased estimator of θ, let T1 be a sufficient statistic, and let T2 be a minimal sufficient statistic. Define U1 and U2 as follows:
U1 := E(U∣T1)
U2 := E(U∣T2)
Then it holds that:
Var U2 ≤ Var U1
Explanation
Whether the conditioning is on T1 or on T2, the resulting estimator still hits θ in expectation, since taking the conditional expectation of an unbiased estimator preserves unbiasedness; but, roughly speaking, it fluctuates less when the minimal sufficient statistic is the one given. It is easy to remember: the minimality of the sufficient statistic leads to the minimality of the variance of the unbiased estimator.
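For a concrete feel of the theorem, here is a minimal Monte Carlo sketch in Python. The setup (a Bernoulli(p) sample, U = X1, T1 = the two half-sums, T2 = the total count) is an illustrative assumption chosen for this sketch, not something taken from the statement above; with it, U1 becomes the first-half sample mean and U2 the overall sample mean.

```python
import numpy as np

# Illustrative setup (an assumption for this sketch): X1, ..., Xn iid Bernoulli(p).
#   U  = X1                                         -> unbiased for p
#   T1 = (sum of first half, sum of second half)    -> sufficient
#   T2 = sum of all observations                    -> minimal sufficient
# Then U1 = E(U | T1) is the first-half sample mean and
#      U2 = E(U | T2) is the overall sample mean.
rng = np.random.default_rng(0)
p, n, reps = 0.3, 20, 200_000

X = rng.binomial(1, p, size=(reps, n))
U1 = X[:, : n // 2].mean(axis=1)   # E(U | T1): mean of the first half
U2 = X.mean(axis=1)                # E(U | T2): overall sample mean

print("E[U1] ~", U1.mean(), "  E[U2] ~", U2.mean())             # both ~ p
print("Var U1 ~", U1.var(), " (theory:", p * (1 - p) / (n // 2), ")")
print("Var U2 ~", U2.var(), " (theory:", p * (1 - p) / n, ")")  # Var U2 <= Var U1
```

With p = 0.3 and n = 20, the theoretical variances are p(1−p)/10 = 0.021 for U1 and p(1−p)/20 = 0.0105 for U2, so the simulated values should come out in roughly a 2:1 ratio, consistent with Var U2 ≤ Var U1.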
Proof
Definition of the Minimal Sufficient Statistic: A sufficient statistic T(X) is called a minimal sufficient statistic if it can be represented as a function of every other sufficient statistic T′(X); that is, T(x) can be written as a function of T′(x).
According to the definition of the minimal sufficient statistic, T2 can be represented as a function of T1, so conditioning on T2 is coarser than conditioning on T1. By the smoothing (tower) property of conditional expectation,
E(U1∣T2) = E(E(U∣T1)∣T2) = E(U∣T2) = U2
Property of Conditional Variance:
Var(X)=E(Var(X∣Y))+Var(E(X∣Y))
Following the property of conditional variance, for U1 and T2 we have
Var U1 = E Var(U1∣T2) + Var E(U1∣T2)
       = E Var(U1∣T2) + Var U2
Since E Var(U1∣T2) ≥ 0, it follows that Var U2 ≤ Var U1. The same argument works for any other sufficient statistic T1, so the variance of the conditional expectation of the unbiased estimator U is smallest when the conditioning is on the minimal sufficient statistic T2.
■
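As a side check, the two identities used in the proof, E(U1∣T2) = U2 and Var U1 = E Var(U1∣T2) + Var E(U1∣T2), can also be inspected numerically. The sketch below reuses the illustrative Bernoulli setup from the Explanation (again an assumption made for this example): the law of total variance holds exactly for the empirical distribution of the simulated replications, and Var E(U1∣T2) approximates Var U2 = p(1−p)/n.

```python
import numpy as np

# Empirical check of Var U1 = E Var(U1|T2) + Var E(U1|T2), the decomposition
# used in the proof, under the illustrative Bernoulli setup from above
# (an assumption: U1 = first-half mean, T2 = total count, E(U1|T2) = U2 in theory).
rng = np.random.default_rng(1)
p, n, reps = 0.3, 20, 200_000

X = rng.binomial(1, p, size=(reps, n))
U1 = X[:, : n // 2].mean(axis=1)      # U1 = E(U | T1): first-half sample mean
T2 = X.sum(axis=1)                    # minimal sufficient statistic

e_var, var_e = 0.0, 0.0
for t in np.unique(T2):
    grp = U1[T2 == t]                             # replications with the same T2
    w = grp.size / reps                           # empirical weight of {T2 = t}
    e_var += w * grp.var()                        # accumulates E Var(U1 | T2)
    var_e += w * (grp.mean() - U1.mean()) ** 2    # accumulates Var E(U1 | T2)

# The decomposition holds exactly for the empirical distribution (up to
# floating point), and Var E(U1|T2) approximates Var U2 = p(1-p)/n.
print("Var U1                      ~", U1.var())
print("E Var(U1|T2) + Var E(U1|T2) ~", e_var + var_e)
print("Var E(U1|T2)                ~", var_e, " vs p(1-p)/n =", p * (1 - p) / n)
```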