Pareto Distribution
Definition 1
For the scale parameter $x_{0} > 0$ and the shape parameter $\alpha > 1$, the following probability distribution is referred to as the Pareto Distribution, Power Law, or Scale-free Distribution:
- Continuous: For the constant $C$ that satisfies $\displaystyle \int_{x_{0}}^{\infty} p(x) dx = 1$ $$ p(x) = C x^{-\alpha} \qquad , x > x_{0} $$
- Discrete: For the Riemann zeta function $\zeta$ $$ p_{k} = {{ 1 } \over { \zeta (\alpha) }} k^{-\alpha} \qquad , k \in \mathbb{N} $$
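As a quick numerical sanity check of this definition (a minimal sketch, not part of the original post; the packages QuadGK and SpecialFunctions and the values $\alpha = 2.5$, $x_{0} = 1$ are assumptions for illustration), the following Julia code confirms that $C = (\alpha - 1) x_{0}^{\alpha - 1}$ normalizes the continuous density and that $\zeta(\alpha)$ normalizes the discrete one.

using QuadGK            # numerical integration
using SpecialFunctions  # Riemann zeta function

α, x₀ = 2.5, 1.0                       # arbitrary example parameters
C = (α - 1) * x₀^(α - 1)               # normalizing constant of the continuous pdf
p(x) = C * x^(-α)                      # p(x) = C x^{-α}, x > x₀

quadgk(p, x₀, Inf)[1]                  # ∫ p(x) dx from x₀ to ∞ ≈ 1.0
sum(k^(-α) for k in 1:10^6) / zeta(α)  # Σ k^{-α} / ζ(α) ≈ 1.0 (truncated at k = 10⁶)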
Basic Properties
- [1] Moment generating function: The moment generating function of the Pareto distribution does not exist.
- [2] Mean and variance: If $X \sim \text{Pareto} \left( x_{0}, \alpha \right)$, then $$ \begin{align*} E (X) =& {{ \alpha - 1 } \over { \alpha - 2 }} x_{0} & , \alpha > 2 \\ \operatorname{Var} (X) =& {{ (\alpha - 1) } \over { \left( \alpha -2 \right)^{2} (\alpha - 3) }} x_{0}^{2} & , \alpha > 3 \end{align*} $$
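Formula [2] can be checked against Distributions.jl (a sketch, not from the original post; note that the density of Distributions.jl's Pareto(a, θ) is $a \theta^{a} x^{-(a+1)}$ for $x \geq \theta$, so the exponent $\alpha$ used here corresponds to the package's shape $\alpha - 1$; the values $\alpha = 4$, $x_{0} = 1$ are arbitrary):

using Distributions, Statistics

α, x₀ = 4.0, 1.0
d = Pareto(α - 1, x₀)        # exponent α in this post ↔ shape α - 1 in Distributions.jl

(α - 1) / (α - 2) * x₀                   # formula [2]: E(X)   = 1.5
mean(d)                                  # package value       = 1.5
(α - 1) / ((α - 2)^2 * (α - 3)) * x₀^2   # formula [2]: Var(X) = 0.75
var(d)                                   # package value       = 0.75

X = rand(d, 10^6)            # Monte Carlo check with a large sample
mean(X), var(X)              # ≈ (1.5, 0.75)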
Theorems
- [a] Scale-freeness: The Pareto distribution is the unique Scale-free Distribution. In other words, if for every $b$ there exists some function $g$ such that the premise below holds, then for some constant $\alpha$ $$ p(bx) = g(b) p(x) \implies p(x) = p(1) x^{-\alpha} $$
- [b] The $k$th moment: If $0 < k < \alpha - 1$, then the $k$th moment exists and $$ E X^{k} = {{ \alpha - 1 } \over { \alpha - 1 - k }} x_{0}^{k} $$
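The moment formula [b] can likewise be verified numerically; the sketch below (illustrative only, assuming the QuadGK package and the arbitrary values $\alpha = 3.5$, $x_{0} = 2$) integrates $x^{k} p(x)$ directly and compares the result with the closed form.

using QuadGK

α, x₀ = 3.5, 2.0
C = (α - 1) * x₀^(α - 1)
p(x) = C * x^(-α)

for k in 1:2                                    # any k with 0 < k < α - 1 = 2.5
    numeric = quadgk(x -> x^k * p(x), x₀, Inf)[1]
    exact   = (α - 1) / (α - 1 - k) * x₀^k      # theorem [b]
    println("k = $k:  numeric ≈ $(round(numeric, digits = 6)),  exact = $exact")
end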
Description
The Pareto distribution is the representative distribution for describing the inequality prevalent in the real world, and it is closely related to the following concepts:
- Heaps’ law and Zipf’s law: Empirical laws concerning the frequency of words.
- Book sales
- Traffic volume
- Earthquake magnitudes
- Diameters of craters
- Wealth
- Citation count
Looking at the shape of the probability density function, one can intuitively see that the greater the shape parameter $\alpha$, the more the probability mass piles up near the minimum $x_{0}$ while the heavy tail still produces arbitrarily large values. In economic terms, a handful of the rich hold a seemingly endless amount of money, and the poor are plentiful.
The statement that the Pareto distribution possesses scale-freeness literally means that it has no characteristic scale. For example, two random variables following Poisson distributions with parameters $\lambda_{1} = 10$ and $\lambda_{2} = 1000$ look completely different depending on where you look, whereas the Pareto distribution looks essentially the same wherever you look. Mathematically, this corresponds to the conclusion of [a] being the same regardless of the value given to $b$.
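A concrete way to see this contrast (a small sketch, not from the original text) is to compare the ratio $p(bx)/p(x)$: for a power law it depends only on $b$, whereas for a Poisson pmf the analogous ratio changes drastically with where you look.

using Distributions

# power law p(x) ∝ x^{-α}: the ratio p(2x)/p(x) = 2^{-α}, independent of x
α = 2.5
p(x) = x^(-α)                             # the constant C cancels in the ratio
[p(2x) / p(x) for x in (1, 10, 100)]      # always 2^{-2.5} ≈ 0.177

# Poisson pmf: the same ratio depends strongly on where you look
q(k) = pdf(Poisson(10), k)
[q(2k) / q(k) for k in (1, 10, 100)]      # wildly different orders of magnitude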
Proof
[1]
The existence of a random variable’s moment generating function implies that the $k$th moment exists for every $k \in \mathbb{N}$. However, as specified in theorem [b], the $k$th moment of the Pareto distribution exists only when $k < \alpha - 1$, so for any fixed $\alpha$ the moments of sufficiently high order fail to exist; thus the moment generating function cannot exist.
■
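The divergence behind [1] can also be observed directly; the following sketch (illustrative only, with arbitrary values $t = 0.1$, $\alpha = 3$, $x_{0} = 1$ and the QuadGK package) shows the truncated integrals $\int_{x_{0}}^{M} e^{tx} p(x) dx$ growing without bound as the cutoff $M$ increases, so $E e^{tX}$ cannot be finite for any $t > 0$.

using QuadGK

α, x₀, t = 3.0, 1.0, 0.1
C = (α - 1) * x₀^(α - 1)
p(x) = C * x^(-α)

# truncated versions of E[e^{tX}] blow up as the upper cutoff M grows
for M in (10, 100, 500, 1000)
    println("M = $M:  ", quadgk(x -> exp(t * x) * p(x), x₀, M)[1])
end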
[2]
Strategy: Use the moment formula [b].
By the moment formula [b], $$ \begin{align*} EX^{1} =& {{ \alpha - 1 } \over { \alpha - 1 - 1 }} x_{0}^{1} \\ =& {{ \alpha - 1 } \over { \alpha - 2 }} x_{0}^{1} \end{align*} $$ and $\displaystyle EX^{2} = {{ \alpha - 1 } \over { \alpha - 3 }} x_{0}^{2}$, so $$ \begin{align*} \operatorname{Var} X =& {{ \alpha - 1 } \over { \alpha - 3 }} x_{0}^{2} - \left[ {{ \alpha - 1 } \over { \alpha - 2 }} x_{0}^{1} \right]^{2} \\ =& \left[ {{ 1 } \over { \alpha - 3 }} - {{ \alpha - 1 } \over { \left( \alpha - 2 \right)^{2} }} \right] (\alpha - 1) x_{0}^{2} \\ =& \left[ \alpha^{2} - 4 \alpha + 4 - \alpha^{2} + 4 \alpha - 3 \right] {{ (\alpha - 1) } \over { (\alpha - 3) \left( \alpha -2 \right)^{2} }} x_{0}^{2} \\ =& {{ (\alpha - 1) } \over { \left( \alpha -2 \right)^{2} (\alpha - 3) }} x_{0}^{2} \end{align*} $$
■
[a]
Assume that for every $b$ there exists some function $g$ such that $$ p(bx) = g(b) p(x) $$ holds. Substituting $x = 1$ gives $p(b) = g(b) p(1)$, hence $g(b) = p(b) / p(1)$ and $$ p(bx) = {{ p(b) p(x) } \over { p(1) }} $$ Differentiating with respect to $b$ yields $$ x p '(bx) = {{ p ' (b) p(x) } \over { p(1) }} $$ Substituting $b=1$ and applying the derivative of the logarithm, $$ \begin{align*} & x p '(x) = {{ p ' (1) p(x) } \over { p(1) }} \\ \implies & {{ p '(x) } \over { p(x) }} = {{ p '(1) } \over { p(1) }} \cdot {{ 1 } \over { x }} \\ \implies & {{ d \log p(x) } \over { dx }} = {{ p '(1) } \over { p(1) }} \cdot {{ 1 } \over { x }} \\ \implies & d \log p(x) = {{ p '(1) } \over { p(1) }} {{ 1 } \over { x }} dx \end{align*} $$ This is a simple separable first-order differential equation; integrating both sides gives, for some constant $\text{constant}$, $$ \log p(x) = {{ p '(1) } \over { p(1) }} \log x + \text{constant} $$ Substituting $x = 1$ shows $\text{constant} = \log p(1)$. Defining $\displaystyle \alpha := - {{ p '(1) } \over { p(1) }}$ gives the desired equation: $$ \begin{align*} & \log p(x) = - \alpha \log x + \log p(1) \\ \implies & \log p(x) = \log x^{-\alpha} + \log p(1) \\ \implies & \log p(x) = \log x^{-\alpha} p(1) \\ \implies & p(x) = p(1) x^{-\alpha} \end{align*} $$
■
[b]
Since $\alpha > 1$, from $\displaystyle \int_{x_{0}}^{\infty} C x^{-\alpha} dx = 1$ we obtain $C = \left( \alpha - 1 \right) x_{0}^{\alpha - 1}$. Moreover, since $k < \alpha - 1$, the exponent $k - \alpha + 1$ is negative, so $x^{k - \alpha + 1} \to 0$ as $x \to \infty$. Therefore, $$ \begin{align*} E X^{k} =& \int_{x_{0}}^{\infty} x^{k} C x^{-\alpha} dx \\ =& C \int_{x_{0}}^{\infty} x^{k-\alpha} dx \\ =& \left( \alpha - 1 \right) x_{0}^{\alpha - 1} \left[ {{ 1 } \over { k - \alpha + 1 }} x^{k - \alpha + 1} \right]_{x_{0}}^{\infty} \\ =& \left( \alpha - 1 \right) x_{0}^{\alpha - 1} \left( 0 - {{ 1 } \over { k - \alpha + 1 }} x_{0}^{k - \alpha + 1} \right) \\ =& {{ \alpha - 1 } \over { \alpha - 1 - k }} x_{0}^{k} \end{align*} $$
■
Visualization
The following is Julia code that displays the probability density function of the Pareto distribution as a GIF.
@time using LaTeXStrings
@time using Distributions
@time using Plots
cd(@__DIR__)  # save the GIF next to this script

x = 1:0.1:10
# sweep the shape parameter up and back down so the animation loops smoothly
A = collect(0.5:0.01:3.5); append!(A, reverse(A))

# note: Distributions.jl's Pareto(α) has density α x^{-(α+1)} for x ≥ 1,
# so its shape α corresponds to the exponent α + 1 in the definition above
animation = @animate for α ∈ A
    plot(x, pdf.(Pareto(α), x),
        color = :black,
        label = "α = $(round(α, digits = 2))", size = (400, 300))
    xlims!(0, 5); ylims!(0, 4); title!(L"\mathrm{pdf\,of\,Pareto}(\alpha)")
end
gif(animation, "pdf.gif")
Newman. (2005). Power laws, Pareto distributions and Zipf’s law. https://doi.org/10.1080/00107510500052444