The Variance of an Unbiased Estimator Given a Sufficient Statistic is Minimized
Theorem 1
Let’s say we have a parameter $\theta$. Suppose $W$ is an unbiased estimator of $\theta$, $T$ is a sufficient statistic, and $T'$ is a minimal sufficient statistic. Then it holds that:
$$ \operatorname{Var} E \left[ W \mid T' \right] \le \operatorname{Var} E \left[ W \mid T \right] $$
Explanation
Whether $T$ or $T'$ is given, $W$ being an unbiased estimator means it hits $\theta$ in expectation, but roughly speaking, it does so with less fluctuation when the minimal sufficient statistic $T'$ is given. It’s easy to remember that the minimality of the sufficient statistic leads to the minimality of the variance of the unbiased estimator.
Proof
Definition of the Minimal Sufficient Statistic: A sufficient statistic $T'$ is called a minimal sufficient statistic if it can be represented as a function of every other sufficient statistic $T$, that is, $T' = \phi(T)$ for some function $\phi$.
According to the definition of the minimal sufficient statistic, since $T'$ can be represented as a function of $T$, the tower property of conditional expectation gives
$$ E \left[ E \left[ W \mid T \right] \mid T' \right] = E \left[ W \mid T' \right] $$
Following the property of conditional variance $\operatorname{Var} (X) = E \left[ \operatorname{Var} ( X \mid Y ) \right] + \operatorname{Var} \left( E [ X \mid Y ] \right)$, for $X = E [ W \mid T ]$ and $Y = T'$ we have
$$ \begin{align*} \operatorname{Var} E [ W \mid T ] &= E \left[ \operatorname{Var} \left( E [ W \mid T ] \mid T' \right) \right] + \operatorname{Var} \left( E \left[ E [ W \mid T ] \mid T' \right] \right) \\ &\ge \operatorname{Var} \left( E \left[ E [ W \mid T ] \mid T' \right] \right) \\ &= \operatorname{Var} E \left[ W \mid T' \right] \end{align*} $$
This holds for any other sufficient statistic $T$, so $\operatorname{Var} E \left[ W \mid T' \right]$, the variance of the unbiased estimator conditioned on the minimal sufficient statistic, is the smallest among all $\operatorname{Var} E \left[ W \mid T \right]$.
■
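The inequality can be checked numerically. The following sketch (not from the source; the Poisson setup is an illustrative assumption) takes $X_1, X_2$ iid $\text{Poisson}(\lambda)$ with $W = X_1$ unbiased for $\lambda$. Conditioning on the full data $T = (X_1, X_2)$, which is trivially sufficient, leaves $W$ unchanged, while conditioning on the minimal sufficient statistic $T' = X_1 + X_2$ gives $E[W \mid T'] = T'/2$, whose variance should be about half as large.

```python
import math
import random

def poisson(lam: float) -> int:
    """Sample a Poisson(lam) variate via Knuth's multiplication method."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

random.seed(0)
lam, n_sim = 3.0, 200_000

w_vals, rb_vals = [], []
for _ in range(n_sim):
    x1, x2 = poisson(lam), poisson(lam)
    w_vals.append(x1)              # E[W | T] with T = (X1, X2): just W itself
    rb_vals.append((x1 + x2) / 2)  # E[W | T'] with T' = X1 + X2

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Both estimators are unbiased (mean ≈ lam), but conditioning on the
# minimal sufficient statistic cuts the variance (≈ lam/2 vs ≈ lam).
print(mean(w_vals), var(w_vals))
print(mean(rb_vals), var(rb_vals))
```

Both printed means should hover near $\lambda = 3$, while the second variance is roughly half the first, in line with $\operatorname{Var} E[W \mid T'] \le \operatorname{Var} E[W \mid T]$.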
Casella, G., & Berger, R. L. (2001). Statistical Inference (2nd Edition): p305. ↩︎