Cauchy Distribution: A Distribution Without a Mean

Definition

Figure pdf.png: probability density functions of the Cauchy distribution, $t$-distributions, and the standard normal, produced by the code below.

The continuous probability distribution with the following probability density function is called the Cauchy distribution $C$: $$ f(x) = {1 \over \pi} {1 \over {x^2 + 1}} \qquad , x \in \mathbb{R} $$
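
As a quick check that this is a proper density, note that an antiderivative of $1/(x^{2}+1)$ is $\arctan x$, so the total probability is $$ \int_{-\infty}^{\infty} {1 \over \pi} {1 \over {x^2 + 1}} dx = {1 \over \pi} \big[ \arctan x \big]_{-\infty}^{\infty} = {1 \over \pi} \left( {\pi \over 2} + {\pi \over 2} \right) = 1 $$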

Explanation

It may seem like every probability distribution should have a mean and variance, but that is not always the case. A prime example is the Cauchy distribution, which at a glance resembles the normal distribution but has much heavier tails on both sides. Regardless of the parameters, its moment-generating function does not exist, and in fact none of its moments, including the population mean and variance, exist either.
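
The nonexistence of the mean can also be seen directly: the integral defining $E(X)$ does not converge absolutely, since $$ E|X| = \int_{-\infty}^{\infty} |x| {1 \over \pi} {1 \over {x^2 + 1}} dx = {2 \over \pi} \int_{0}^{\infty} {x \over {x^2 + 1}} dx = {1 \over \pi} \left[ \ln (x^2+1) \right]_{0}^{\infty} = \infty $$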

Of course, whether or not a population mean exists, a sample mean can always be computed; it just tells us nothing. For a Cauchy distribution translated by $\theta$ along the $x$-axis, the sample mean follows the same Cauchy distribution as a single observation, so it never concentrates around $\theta$, and the mle $\hat{\theta}$ of $\theta$ has no closed form in general and must be found numerically.
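
The following is a minimal simulation sketch of this point, assuming only Distributions.jl plus an arbitrarily chosen seed and sample size: it draws standard Cauchy samples and prints the running sample means, which fail to settle down no matter how many observations are used.

using Distributions
using Random

Random.seed!(0)                         # arbitrary seed, only for reproducibility
data = rand(Cauchy(), 10_000)           # i.i.d. draws from the standard Cauchy
running_mean = cumsum(data) ./ (1:10_000)

# Unlike the normal case, these running means do not converge to anything;
# each one is itself distributed as a standard Cauchy random variable.
println(running_mean[[10, 100, 1_000, 10_000]])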

Meanwhile, the probability density function of the t-distribution with $n$ degrees of freedom is: $$ g(y) = {{\Gamma ( (n+1)/2 ) } \over { \sqrt{\pi n} \Gamma (n/2) }} { {1} \over {(1 + y^{2} / n)^{(n+1)/2} } } $$ Thus, the Cauchy distribution can be seen as the t-distribution with $n=1$ degree of freedom.
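
Indeed, substituting $n=1$ and using $\Gamma (1) = 1$, $\Gamma (1/2) = \sqrt{\pi}$ gives exactly the Cauchy density: $$ g(y) = {{\Gamma (1)} \over { \sqrt{\pi} \Gamma (1/2) }} {1 \over {1 + y^{2}}} = {1 \over {\sqrt{\pi} \cdot \sqrt{\pi}}} {1 \over {1 + y^{2}}} = {1 \over \pi} {1 \over {y^{2} + 1}} = f(y) $$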

Theorem

The moment-generating function of the Cauchy distribution does not exist.

Proof 1

The probability density function of the Cauchy distribution is $\displaystyle f(x) = {1 \over \pi} {1 \over {x^2 + 1}}, -\infty < x < \infty$. We show that the moment-generating function $\displaystyle E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} {1 \over \pi} {1 \over {x^2 + 1}} dx$ diverges. Fix $t>0$ and $x>0$. By the Mean Value Theorem, there exists some $\xi$ with $0< \xi < tx$ such that $$ {{e^{tx} - e^0} \over {tx - 0}} = { { e^{tx} - 1 } \over {tx} } = e^{\xi} \ge e^0 = 1 $$ A little rearrangement yields the following inequality: $$ e^{tx} \ge 1 + tx \ge tx $$ Returning to the integral: $$ \begin{align*} E(e^{tX}) =& \int_{-\infty}^{\infty} e^{tx} {1 \over \pi} {1 \over {x^2 + 1}} dx \\ \ge& \int_{0}^{\infty} e^{tx} {1 \over \pi} {1 \over {x^2 + 1}} dx \\ \ge& \int_{0}^{\infty} {1 \over \pi} {tx \over {x^2 + 1}} dx \\ =& { t \over {2 \pi} } \left[ \ln (x^2+1) \right]_{0}^{\infty} \\ =& \infty \end{align*} $$ Therefore, the moment-generating function of the Cauchy distribution does not exist.
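
As a rough numerical illustration of the last step (a sketch assuming the QuadGK.jl package and arbitrary cutoffs $R$), the truncated lower bound $\int_{0}^{R} {1 \over \pi} {tx \over {x^2+1}} dx = {t \over {2\pi}} \ln (R^2+1)$ keeps growing as $R$ increases, matching the closed form used in the proof:

using QuadGK                            # assumed available for numerical quadrature

t = 1.0
for R in (1e2, 1e4, 1e6)
    numeric, _ = quadgk(x -> t * x / (pi * (x^2 + 1)), 0, R)   # ∫₀ᴿ tx / (π(x²+1)) dx
    exact = t / (2pi) * log(R^2 + 1)                           # closed form from the proof
    println("R = $R:  numeric ≈ $(round(numeric, digits = 3)),  closed form ≈ $(round(exact, digits = 3))")
end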

Code

The following is Julia code that plots the probability density functions of the Cauchy distribution, t-distributions, and the standard normal distribution.

@time using LaTeXStrings
@time using Distributions
@time using Plots

cd(@__DIR__)   # save the figure next to this script

x = -4:0.1:4   # evaluation grid for the densities
plot(x, pdf.(Cauchy(), x),
 color = :red,
 label = "Cauchy", size = (400,300))
plot!(x, pdf.(TDist(3), x),
 color = :orange,
 label = "t(3)", size = (400,300))
plot!(x, pdf.(TDist(30), x),
 color = :black, linestyle = :dash,
 label = "t(30)", size = (400,300))
plot!(x, pdf.(Normal(), x),
 color = :black,
 label = "Standard Normal", size = (400,300))

xlims!(-4,5); ylims!(0,0.5); title!(L"\mathrm{pdf\,of\, t}(\nu)")
png("pdf")   # writes pdf.png

  1. Hogg et al. (2013). Introduction to Mathematical Statistics (7th Edition): p63.