Sufficient Statistics and Maximum Likelihood Estimators of Exponential Distributions
Theorem
Let a random sample $X := (X_1, \dots, X_n) \sim \exp(\lambda)$ be drawn from an exponential distribution.
The sufficient statistic T and maximum likelihood estimator λ^ for λ are as follows.
$$ T = \sum_{k=1}^{n} X_k \qquad \hat{\lambda} = \frac{n}{\sum_{k=1}^{n} X_k} $$
Proof
Sufficient Statistic
$$ \begin{aligned} f(\mathbf{x};\lambda) &= \prod_{k=1}^{n} f(x_k;\lambda) \\ &= \prod_{k=1}^{n} \lambda e^{-\lambda x_k} \\ &= \lambda^n e^{-\lambda \sum_k x_k} \\ &= \lambda^n e^{-\lambda \sum_k x_k} \cdot 1 \end{aligned} $$
Neyman Factorization Theorem: Let a random sample X1,⋯,Xn have the same probability mass/density function f(x;θ) for a parameter θ∈Θ. A statistic Y=u1(X1,⋯,Xn) is a sufficient statistic for θ if there exist two non-negative functions k1,k2≥0 that satisfy the following.
$$ f(x_1;\theta) \cdots f(x_n;\theta) = k_1 \left[ u_1(x_1,\dots,x_n);\theta \right] k_2(x_1,\dots,x_n) $$
Note, k2 must not depend on θ.
According to the Neyman Factorization Theorem, T:=∑kXk is the sufficient statistic for λ.
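The factorization above says the joint density depends on the data only through $\sum_k x_k$. A minimal numerical sketch of this (the sample values below are hypothetical): two different samples with the same sum produce identical likelihoods for every value of $\lambda$.

```python
import numpy as np

def likelihood(x, lam):
    """Joint exp(lam) density of a sample: lam^n * exp(-lam * sum(x))."""
    x = np.asarray(x, dtype=float)
    return lam ** len(x) * np.exp(-lam * x.sum())

# Two distinct samples sharing the same sufficient statistic T = sum(x) = 4.0
x1 = [0.5, 1.0, 2.5]
x2 = [1.0, 1.0, 2.0]

# The likelihoods agree at every lambda, so T carries all the
# information the sample holds about lambda.
for lam in (0.5, 1.0, 2.0):
    assert np.isclose(likelihood(x1, lam), likelihood(x2, lam))
```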
Maximum Likelihood Estimator
$$ \begin{aligned} \log L(\lambda;\mathbf{x}) &= \log f(\mathbf{x};\lambda) \\ &= \log \lambda^n e^{-\lambda \sum_k x_k} \\ &= n \log \lambda - \lambda \sum_{k=1}^{n} x_k \end{aligned} $$
The log-likelihood of the random sample is as above. For the likelihood to attain its maximum, the partial derivative with respect to λ must vanish, thus
$$ 0 = \frac{n}{\lambda} - \sum_{k=1}^{n} x_k \implies \lambda = \frac{n}{\sum_{k=1}^{n} x_k} $$
Since the second derivative $\partial^2 \log L / \partial \lambda^2 = -n/\lambda^2 < 0$, this critical point is indeed a maximum. Therefore, the maximum likelihood estimator λ^ for λ is as follows.
$$ \hat{\lambda} = \frac{n}{\sum_{k=1}^{n} X_k} $$
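As a quick sanity check (a sketch, with the true rate and sample size chosen arbitrarily), one can simulate a large exponential sample and confirm that $\hat{\lambda} = n / \sum_k X_k$ recovers the true rate. Note that NumPy parameterizes the exponential by the scale $1/\lambda$, not the rate.

```python
import numpy as np

rng = np.random.default_rng(0)
true_lam = 2.0  # assumed true rate for this demo

# NumPy's exponential takes scale = 1/lambda
x = rng.exponential(scale=1.0 / true_lam, size=100_000)

# MLE from the derivation: n / sum(x), i.e. the reciprocal sample mean
lam_hat = len(x) / x.sum()
print(lam_hat)  # should be close to 2.0
```

Since the standard error of $\hat{\lambda}$ is roughly $\lambda/\sqrt{n}$, the estimate lands within a few hundredths of the true rate at this sample size.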
■