
The Unique Maximum Likelihood Estimator Depends on the Sufficient Statistic 📂Mathematical Statistics


Theorem

If a sufficient statistic $T$ exists for a parameter $\theta$ and a unique maximum likelihood estimator $\hat{\theta}$ for $\theta$ exists, then $\hat{\theta}$ can be represented as a function of $T$.

Proof 1

Consider a random sample $X_{1} , \cdots , X_{n}$ with probability density function $f \left( x ; \theta \right)$, and let $T := T \left( X_{1} , \cdots , X_{n} \right)$ be a sufficient statistic for $\theta$ with probability density function $f_{T}$. By the definition of a sufficient statistic, the likelihood function $L$ can be factored as $$ \begin{align*} & L \left( \theta ; x_{1} , \cdots , x_{n} \right) \\ =& f \left( x_{1} ; \theta \right) \cdots f \left( x_{n} ; \theta \right) \\ =& f_{T} \left( t \left( x_{1} , \cdots , x_{n} \right) ; \theta \right) \cdot H \left( x_{1} , \cdots , x_{n} \right) \end{align*} $$ where $H$ is some function that does not depend on $\theta$. Since $H$ is free of $\theta$, a value of $\theta$ maximizes $L$ if and only if it maximizes $f_{T} \left( t \left( x_{1} , \cdots , x_{n} \right) ; \theta \right)$, which depends on the data only through $t \left( x_{1} , \cdots , x_{n} \right)$. Since the maximizing $\theta$ is assumed to be unique, the maximum likelihood estimator $\hat{\theta}$ must be a function of $T$.
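As a concrete illustration (not from the source), the factorization can be written out for a random sample from $N \left( \theta , 1 \right)$: completing the square splits the likelihood into a factor depending on $\theta$ only through $\bar{x}$ and a factor $H$ free of $\theta$, so $T = \bar{X}$ is sufficient and the unique maximum likelihood estimator $\hat{\theta} = \bar{X}$ is a function of $T$, as the theorem predicts. $$ \begin{align*} & L \left( \theta ; x_{1} , \cdots , x_{n} \right) \\ =& \left( 2 \pi \right)^{-n/2} \exp \left( - \frac{1}{2} \sum_{i=1}^{n} \left( x_{i} - \theta \right)^{2} \right) \\ =& \exp \left( - \frac{n}{2} \left( \bar{x} - \theta \right)^{2} \right) \cdot \left( 2 \pi \right)^{-n/2} \exp \left( - \frac{1}{2} \sum_{i=1}^{n} \left( x_{i} - \bar{x} \right)^{2} \right) \end{align*} $$ Here the first factor is, up to a constant free of $\theta$, the density of $\bar{X} \sim N \left( \theta , 1/n \right)$, and the second factor plays the role of $H \left( x_{1} , \cdots , x_{n} \right)$.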

Explanation

For example, for a random sample from the uniform distribution $U (0, \theta)$, the sufficient statistic is $\max_{k} X_{k}$, and the maximum likelihood estimator is also $\max_{k} X_{k}$, so the estimator is trivially a function of the sufficient statistic, consistent with this theorem.
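The uniform example can be checked numerically. The sketch below (with a hypothetical true parameter $\theta = 5$ and sample size $n = 100$) evaluates the likelihood $L \left( \theta \right) = \theta^{-n}$ for $\theta \ge \max_{k} x_{k}$, and $0$ otherwise, over a grid, and confirms that the maximizer coincides with the sufficient statistic $\max_{k} x_{k}$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 5.0                            # hypothetical true parameter
x = rng.uniform(0.0, theta_true, size=100)  # random sample from U(0, theta)

# Likelihood for U(0, theta): L(theta) = theta^{-n} if theta >= max(x), else 0
def likelihood(theta: float, x: np.ndarray) -> float:
    return theta ** (-len(x)) if theta >= x.max() else 0.0

# Grid search over candidate values of theta
grid = np.linspace(0.01, 10.0, 100_000)
L_vals = np.array([likelihood(t, x) for t in grid])
theta_hat = grid[L_vals.argmax()]

# The grid maximizer sits at (essentially) max(x): the MLE is the sufficient statistic
print(theta_hat, x.max())
```

Since $L \left( \theta \right)$ is zero below $\max_{k} x_{k}$ and strictly decreasing above it, the grid maximizer is the smallest grid point at or above $\max_{k} x_{k}$, matching the closed-form answer up to the grid resolution.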


  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p397. ↩︎