
Jeffreys Prior Distribution

Definition 1

Given a sampling distribution $p( y | \theta)$, the prior $\pi ( \theta ) \propto I^{1/2} ( \theta )$ is called the Jeffreys prior.


  • $I$ refers to the Fisher information. $$ I ( \theta ) = E \left[ \left. \left( {{\partial \ln p (y | \theta) } \over {\partial \theta}} \right)^2 \right| \theta \right] = E \left[ \left. - {{\partial^2 \ln p (y | \theta) } \over { (\partial \theta )^2 }} \right| \theta \right] $$

Description

While the Laplace prior $\pi (\theta) \propto 1$ may seem sufficient as a prior for the parameter $\theta$, it is not invariant under reparameterization: for a function of the parameter such as $\phi = \theta^2$, we have $d \phi = 2 \theta d \theta$, so the induced prior is $\displaystyle \pi (\phi ) \propto {{1} \over {\sqrt{\phi } }}$, which is no longer the flat prior we started with for $\theta$. The Jeffreys prior overcomes this lack of invariance, and is essentially an upgrade over the Laplace prior.
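This invariance can be checked directly from the definition. By the chain rule, $\displaystyle {{\partial \ln p } \over {\partial \phi}} = {{\partial \ln p } \over {\partial \theta}} {{d \theta } \over {d \phi}}$, so the Fisher information transforms as $$ I ( \phi ) = E \left[ \left. \left( {{\partial \ln p (y | \theta) } \over {\partial \theta}} \right)^2 \right| \theta \right] \left( {{d \theta } \over {d \phi}} \right)^2 = I ( \theta ) \left( {{d \theta } \over {d \phi}} \right)^2 $$ Taking square roots gives $\displaystyle I^{1/2} ( \phi ) = I^{1/2} ( \theta ) \left| {{d \theta } \over {d \phi}} \right|$, which is exactly the Jacobian factor a prior density must pick up under a change of variables. Hence $\pi ( \phi ) \propto I^{1/2} ( \phi )$ and $\pi ( \theta ) \propto I^{1/2} ( \theta )$ describe the same prior belief, regardless of the parameterization.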

Examples

For instance, when the data follow an exponential distribution $\displaystyle \exp \left( {{1} \over {\theta}} \right)$, the Laplace prior $\displaystyle \pi ( \theta ) \propto c$ led to an improper posterior.

First, to calculate the Jeffreys prior, from $\displaystyle p( y | \theta ) = {{1} \over { \theta }} \exp \left( - {{ y } \over { \theta }} \right)$ we get $$ {{\partial \ln p (y | \theta) } \over {\partial \theta}} = - {{1 } \over { \theta }} + {{ y} \over { \theta^2 }} $$ Differentiating once more with respect to $\theta$ gives $$ {{\partial^2 \ln p (y | \theta) } \over { (\partial \theta )^2 }} = {{1 } \over { \theta ^2}} - {{ 2 y} \over { \theta^3 }} $$ Since $E [ y | \theta ] = \theta$, $$ E \left[ \left. - {{\partial^2 \ln p (y | \theta) } \over { (\partial \theta )^2 }} \right| \theta \right] = {{ 2 \theta } \over { \theta^3 }} - {{1 } \over { \theta ^2}} = {{1 } \over { \theta ^2}} $$ and we obtain the Jeffreys prior $\displaystyle \pi ( \theta ) \propto I^{1/2} ( \theta ) = {{1 } \over { \theta }}$.
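The Fisher information $\displaystyle I ( \theta ) = {{1} \over {\theta^2}}$ computed above can be sanity-checked by Monte Carlo, averaging the negative second derivative of the log-likelihood over simulated data. This is a minimal sketch; the value of $\theta$, the sample size, and the seed are arbitrary choices for the check.

```python
import random

random.seed(0)

theta = 2.0      # true scale parameter of the exponential distribution (arbitrary)
n = 200_000      # Monte Carlo sample size (arbitrary)

# Draw y ~ Exp(mean = theta); expovariate takes the rate 1/theta.
samples = [random.expovariate(1.0 / theta) for _ in range(n)]

# Negative second derivative of the log-likelihood at each draw:
# -d^2/dtheta^2 ln p(y|theta) = 2y/theta^3 - 1/theta^2
estimate = sum(2 * y / theta**3 - 1 / theta**2 for y in samples) / n

print(estimate)  # should be close to 1/theta^2 = 0.25
```

With a sample of this size the standard error of the estimate is roughly $0.001$, so the average lands near the theoretical value $1 / \theta^2$.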

To check that this posterior is proper, substituting $\displaystyle \theta = {{1} \over {z}}$ and computing the definite integral results in $$ \int_{0 }^{\infty} p ( \theta | y ) d \theta \propto \int_{0}^{\infty} z^2 \exp ( - y z ) {{1} \over {z^2}} dz = {{1} \over {y}} < \infty $$ Therefore, in this case, it can be confirmed that the Jeffreys prior induces a proper posterior.
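The finiteness of this normalizing integral can also be verified numerically. After the substitution $\theta = 1/z$ the integrand reduces to $\exp ( - y z )$, which a simple trapezoid rule handles well; this is a minimal sketch, with an arbitrary observed value $y = 2$ and the integral truncated where the integrand is negligible.

```python
import math

y = 2.0  # arbitrary observed value for the check

# After substituting theta = 1/z, the unnormalized posterior integral
# becomes  ∫_0^∞ exp(-y z) dz, which should equal 1/y.
upper = 40.0 / y           # exp(-y * upper) = exp(-40), negligible truncation error
steps = 100_000
h = upper / steps
vals = [math.exp(-y * i * h) for i in range(steps + 1)]

# Trapezoid rule: sum of interior points plus half the endpoints.
integral = h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

print(integral)  # close to 1/y = 0.5
```

The step size makes the trapezoid error far smaller than the truncation error, so the result agrees with $1/y$ to many decimal places, confirming the integral converges.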


  1. 김달호. (2013). Bayesian Statistics Using R and WinBUGS [R과 WinBUGS를 이용한 베이지안 통계학]: p118. ↩︎