Sufficient Statistics and Maximum Likelihood Estimators of the Poisson Distribution
Theorem
Given a random sample $\mathbf{X} := \left( X_{1} , \cdots , X_{n} \right) \sim \text{Poi} ( \lambda )$ following a Poisson distribution,
the sufficient statistic $T$ and the maximum likelihood estimator $\hat{\lambda}$ for $\lambda$ are as follows:
$$
\begin{align*}
T =& \sum_{k=1}^{n} X_{k}
\\ \hat{\lambda} =& \frac{1}{n} \sum_{k=1}^{n} X_{k}
\end{align*}
$$
Proof
Sufficient Statistic
$$
\begin{align*}
f \left( \mathbf{x} ; \lambda \right) =& \prod_{k=1}^{n} f \left( x_{k} ; \lambda \right)
\\ =& \prod_{k=1}^{n} \frac{ e^{-\lambda} \lambda^{x_{k}} }{ x_{k}! }
\\ =& \frac{ e^{-n \lambda} \lambda^{\sum_{k} x_{k}} }{ \prod_{k} x_{k}! }
\\ =& e^{-n \lambda} \lambda^{\sum_{k} x_{k}} \cdot \frac{1}{\prod_{k} x_{k}!}
\end{align*}
$$
Neyman Factorization Theorem: Let a random sample $X_{1} , \cdots , X_{n}$ have the same probability mass/density function $f \left( x ; \theta \right)$ for parameter $\theta \in \Theta$. A statistic $Y = u_{1} \left( X_{1} , \cdots , X_{n} \right)$ is a sufficient statistic for $\theta$ if there exist two non-negative functions $k_{1} , k_{2} \ge 0$ satisfying the following condition:
$$
f \left( x_{1} ; \theta \right) \cdots f \left( x_{n} ; \theta \right) = k_{1} \left[ u_{1} \left( x_{1} , \cdots , x_{n} \right) ; \theta \right] k_{2} \left( x_{1} , \cdots , x_{n} \right)
$$
Note that $k_{2}$ must not depend on $\theta$.
According to the Neyman Factorization Theorem, taking $k_{1} = e^{-n \lambda} \lambda^{\sum_{k} x_{k}}$, which depends on the data only through $\sum_{k} x_{k}$, and $k_{2} = 1 / \prod_{k} x_{k}!$, which does not depend on $\lambda$, the statistic $T := \sum_{k} X_{k}$ is a sufficient statistic for $\lambda$. A quick numerical check of this factorization is sketched below.
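The following Python sketch, which is not part of the argument itself, simply verifies the factorization numerically: the joint pmf computed as a product of individual Poisson pmfs equals the factorized form $k_{1} \cdot k_{2}$. The rate $2.5$, the sample size, and the seed are arbitrary illustration choices.

```python
import math

import numpy as np

rng = np.random.default_rng(0)
lam = 2.5                       # arbitrary rate, chosen only for illustration
x = rng.poisson(lam, size=8)    # small simulated Poisson sample

# Joint pmf computed directly as a product of individual Poisson pmfs
joint = np.prod([math.exp(-lam) * lam ** int(xk) / math.factorial(int(xk)) for xk in x])

# The same joint pmf in the factorized form k1(sum of x_k; lambda) * k2(x)
k1 = math.exp(-len(x) * lam) * lam ** x.sum()
k2 = 1.0 / np.prod([math.factorial(int(xk)) for xk in x])

print(joint, k1 * k2)           # the two values agree up to floating-point error
```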
Maximum Likelihood Estimator
$$
\begin{align*}
\log L ( \lambda ; \mathbf{x} ) =& \log f \left( \mathbf{x} ; \lambda \right)
\\ =& \log \frac{ e^{-n \lambda} \lambda^{\sum_{k} x_{k}} }{ \prod_{k} x_{k}! }
\\ =& - n \lambda + \sum_{k=1}^{n} x_{k} \log \lambda - \log \prod_{k} x_{k}!
\end{align*}
$$
The log-likelihood function of the random sample is as above. For the likelihood to be maximized, the partial derivative with respect to $\lambda$ must be $0$; therefore
$$
\begin{align*}
& 0 = -n + \sum_{k=1}^{n} x_{k} \frac{1}{\lambda}
\\ \implies & \lambda = \frac{1}{n} \sum_{k=1}^{n} x_{k}
\end{align*}
$$
Since $\log L$ is concave in $\lambda$ (its second derivative is $- \sum_{k} x_{k} / \lambda^{2} \le 0$), this critical point is a global maximum. Consequently, the maximum likelihood estimator $\hat{\lambda}$ for $\lambda$ is as follows:
$$
\hat{\lambda} = \frac{1}{n} \sum_{k=1}^{n} X_{k}
$$
■
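As a numerical illustration of the result, the following sketch (assuming NumPy and a simulated sample; the rate $4.0$, the grid, the sample size, and the seed are arbitrary choices) maximizes the Poisson log-likelihood over a grid of $\lambda$ values and compares the maximizer to the sample mean.

```python
import numpy as np

rng = np.random.default_rng(1)
lam_true = 4.0                        # arbitrary "true" rate, for illustration only
x = rng.poisson(lam_true, size=1000)  # simulated sample

def log_likelihood(lam):
    # Log-likelihood up to the additive constant -log(prod x_k!),
    # which does not depend on lambda and so does not affect the maximizer
    return -len(x) * lam + x.sum() * np.log(lam)

grid = np.linspace(0.1, 10.0, 100_000)          # crude grid search over lambda
lam_hat_grid = grid[np.argmax(log_likelihood(grid))]

print(lam_hat_grid, x.mean())                   # both should be close to lam_true
```

As expected, the grid maximizer essentially coincides with the sample mean $\frac{1}{n} \sum_{k} x_{k}$, matching the closed-form estimator derived above.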