
Cauchy Distribution: A Distribution Without a Mean

Definition

[Figure: probability density functions of the Cauchy, $t$, and standard normal distributions; generated by the code at the end of this post (pdf.png)]

The continuous probability distribution $C$ with the following probability density function is called the Cauchy distribution.

$$ f(x) = {1 \over \pi} {1 \over {x^{2} + 1}} \qquad , x \in \mathbb{R} $$
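As a quick sanity check, the following sketch (not part of the original post) confirms that this closed-form density coincides with pdf.(Cauchy(), x) from Distributions.jl on a small grid.

using Distributions

f(x) = 1 / (π * (x^2 + 1))        # density from the definition above

x = -4:0.5:4
@assert all(isapprox.(f.(x), pdf.(Cauchy(), x); atol = 1e-12))
println("closed-form density matches pdf.(Cauchy(), x)")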

Explanation

It may seem as though every probability distribution should have a mean and a variance, but in reality that is not always the case. A prime example is the Cauchy distribution, which at a glance resembles the normal distribution but has thicker tails on both sides. Regardless of the parameters, its moment-generating function does not exist, and in fact nothing defined through moments, including the population mean and variance, exists either.

Of course, whether or not a population mean exists, a sample mean can still be computed from data; for the Cauchy distribution it simply is not a useful estimator. For a Cauchy distribution translated by $\theta$ along the $x$-axis, the maximum likelihood estimator $\hat{\theta}$ of $\theta$ is not the sample mean: it has no closed form and must be obtained numerically from the likelihood equation.
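To see this concretely, here is a small simulation sketch (not from the original post, assuming only Distributions.jl): the running sample mean of standard Cauchy draws keeps jumping around no matter how many observations accumulate, while the running mean of standard normal draws settles near 0.

using Distributions, Random

Random.seed!(1)                                    # reproducible draws
n = 10^6
cauchy_mean = cumsum(rand(Cauchy(), n)) ./ (1:n)   # running sample means
normal_mean = cumsum(rand(Normal(), n)) ./ (1:n)

for k in (10^2, 10^4, 10^6)
    println("n = $k : Cauchy mean = $(round(cauchy_mean[k], digits = 3)), ",
            "Normal mean = $(round(normal_mean[k], digits = 3))")
end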

Meanwhile, the probability density function of the $t$-distribution is
$$ g(y) = {{\Gamma ( (n+1)/2 ) } \over { \sqrt{\pi n} \Gamma (n/2) }} { {1} \over {(1 + y^{2} / n)^{(n+1)/2} } } $$
Thus, the Cauchy distribution can be seen as a $t$-distribution with degrees of freedom $n=1$: substituting $n=1$ and $\Gamma (1/2) = \sqrt{\pi}$ reduces $g(y)$ to exactly the density $f(x)$ above.
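This identity can also be checked numerically in Julia (a sketch, not from the original post):

using Distributions

x = -5:0.25:5
@assert all(isapprox.(pdf.(TDist(1), x), pdf.(Cauchy(), x); atol = 1e-12))   # t(1) pdf equals the Cauchy pdf
println("pdf.(TDist(1), x) matches pdf.(Cauchy(), x)")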

Theorem

The moment-generating function of the Cauchy distribution does not exist.

Proof ¹

The probability density function of the Cauchy distribution is $\displaystyle f(x) = {1 \over \pi} {1 \over {x^2 + 1}}, \quad -\infty < x < \infty$. We show that the integral defining the moment-generating function,
$$ E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} {1 \over \pi} {1 \over {x^2 + 1}} dx $$
diverges for every $t > 0$. Let $t > 0$ and $x > 0$. By the Mean Value Theorem there exists $0 < \xi < tx$ such that
$$ {{e^{tx} - e^0} \over {tx - 0}} = { { e^{tx} - 1 } \over {tx} } = e^{\xi} \ge e^0 = 1 $$
A little rearrangement yields the following inequality, which also holds trivially at $x = 0$:
$$ e^{tx} \ge 1 + tx \ge tx $$
Returning to the integral, since the integrand is nonnegative,
$$ \begin{align*} E(e^{tX}) =& \int_{-\infty}^{\infty} e^{tx} {1 \over \pi} {1 \over {x^2 + 1}} dx \\ \ge& \int_{0}^{\infty} e^{tx} {1 \over \pi} {1 \over {x^2 + 1}} dx \\ \ge& \int_{0}^{\infty} {1 \over \pi} {tx \over {x^2 + 1}} dx \\ =& { t \over {2 \pi} } \left[ \ln (x^2+1) \right]_{0}^{\infty} \\ =& \infty \end{align*} $$
Since a moment-generating function must be finite on an open interval around $0$, but the integral is infinite for every $t > 0$, the moment-generating function of the Cauchy distribution does not exist.
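As a numerical illustration of this divergence (a sketch, not part of the proof; it uses the QuadGK.jl package, which does not appear in the original post), the truncated integral $\int_{-M}^{M} e^{tx} f(x)\, dx$ grows without bound as $M$ increases, here with $t = 1$.

using QuadGK

f(x) = 1 / (π * (x^2 + 1))                       # Cauchy density
t = 1.0

for M in (10, 20, 40, 80)
    I, _ = quadgk(x -> exp(t * x) * f(x), -M, M)   # truncated MGF integral
    println("integral from -$M to $M of e^(tx) f(x) dx ≈ $I")
end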

Code

The following Julia code draws the probability density functions of the Cauchy distribution, the $t$-distribution, and the standard normal distribution shown in the figure above.

@time using LaTeXStrings
@time using Distributions
@time using Plots

cd(@__DIR__)  # save the figure next to this script

x = -4:0.1:4  # common grid for all densities
plot(x, pdf.(Cauchy(), x),
 color = :red,
 label = "Cauchy", size = (400,300))
plot!(x, pdf.(TDist(3), x),
 color = :orange,
 label = "t(3)", size = (400,300))
plot!(x, pdf.(TDist(30), x),
 color = :black, linestyle = :dash,
 label = "t(30)", size = (400,300))
plot!(x, pdf.(Normal(), x),
 color = :black,
 label = "Standard Normal", size = (400,300))

xlims!(-4,5); ylims!(0,0.5); title!(L"\mathrm{pdf\,of\, t}(\nu)")
png("pdf")  # writes pdf.png

  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p63.