
The Unique Maximum Likelihood Estimator Depends on the Sufficient Statistic

Theorem

If a sufficient statistic $T$ exists for a parameter $\theta$ and a unique maximum likelihood estimator $\hat{\theta}$ of $\theta$ exists, then $\hat{\theta}$ can be represented as a function of $T$.

Proof 1

Consider a random sample $X_{1} , \cdots , X_{n}$ with probability density function $f \left( x ; \theta \right)$, its sufficient statistic $T := T \left( X_{1} , \cdots , X_{n} \right)$, and the probability density function $f_{T}$ of $T$. By the definition of a sufficient statistic (the factorization criterion), the likelihood function $L$ can be expressed as
$$
\begin{align*}
& L \left( \theta ; x_{1} , \cdots , x_{n} \right)
\\ =& f \left( x_{1} ; \theta \right) \cdots f \left( x_{n} ; \theta \right)
\\ =& f_{T} \left( t \left( x_{1} , \cdots , x_{n} \right) ; \theta \right) \cdot H \left( x_{1} , \cdots , x_{n} \right)
\end{align*}
$$
where $H$ is some function that does not depend on $\theta$. Because $H$ is free of $\theta$, maximizing $L$ over $\theta$ is equivalent to maximizing $f_{T} \left( t \left( x_{1} , \cdots , x_{n} \right) ; \theta \right)$ over $\theta$, so both attain their maximum at the same value of $\theta$. That maximizing value depends on the data only through $t \left( x_{1} , \cdots , x_{n} \right)$, and since it is unique by assumption, the maximum likelihood estimator $\hat{\theta}$ of $\theta$ must be a function of $T$.
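
As a numerical sanity check on this factorization argument (a sketch of my own, not part of the proof), consider a Bernoulli$(\theta)$ sample, where $T = \sum_{i} X_{i}$ is sufficient: the full likelihood and the factored form $f_{T}(t; \theta) \cdot H(x)$ agree, and both are maximized at the same $\hat{\theta} = t/n$, which depends on the data only through $t$. The seed, sample size, and true $\theta = 0.3$ below are arbitrary choices for illustration.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)
x = rng.binomial(1, 0.3, size=20)   # hypothetical Bernoulli(theta = 0.3) sample
n, t = len(x), int(x.sum())         # t = observed value of the sufficient statistic T = sum X_i

thetas = np.linspace(0.001, 0.999, 999)

# Full likelihood: L(theta; x) = prod_i f(x_i; theta) = theta^t (1 - theta)^(n - t)
L_full = thetas**t * (1 - thetas)**(n - t)

# Factored form: f_T(t; theta) * H(x), where T ~ Binomial(n, theta) and H does not depend on theta
f_T = comb(n, t) * thetas**t * (1 - thetas)**(n - t)
H = 1 / comb(n, t)
L_fact = f_T * H

assert np.allclose(L_full, L_fact)       # the two expressions agree for every theta
theta_hat = thetas[np.argmax(L_full)]    # maximizer on the grid
print(theta_hat, t / n)                  # both are (approximately) t / n, a function of t alone
```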

Explanation

For example, consider a random sample from the uniform distribution $U (0, \theta)$: the sufficient statistic is $\max_{k} X_{k}$, and the maximum likelihood estimator is $\hat{\theta} = \max_{k} X_{k}$ as well, so the estimator is trivially a function of the sufficient statistic, consistent with this theorem.
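
A quick way to see this numerically (again a minimal sketch under assumed data, not from the text): for a $U(0, \theta)$ sample the likelihood is $\theta^{-n}$ whenever $\theta \geq \max_{k} x_{k}$ and $0$ otherwise, so a grid search over $\theta$ lands on (approximately) $\max_{k} x_{k}$.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 5.0, size=30)     # hypothetical U(0, theta = 5) sample
t = x.max()                          # sufficient statistic max_k X_k

def likelihood(theta: float) -> float:
    """L(theta; x) = theta^(-n) if theta >= max_k x_k, else 0."""
    return theta ** (-len(x)) if theta >= t else 0.0

thetas = np.linspace(0.01, 10.0, 100_000)
theta_hat = thetas[np.argmax([likelihood(th) for th in thetas])]

print(theta_hat, t)                  # theta_hat equals max_k x_k up to the grid resolution
```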


  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p. 397. ↩︎