Convolution Formula of Probability Density Functions
Formula 1
Let $X$ and $Y$ be independent continuous random variables with probability density functions $f_{X}$ and $f_{Y}$. Then the probability density function of $Z := X + Y$ is the convolution $f_{Z} = f_{X} \ast f_{Y}$ of the two density functions. $$ f_{Z} (z) = \left( f_{X} \ast f_{Y} \right) (z) = \int_{-\infty}^{\infty} f_{X} (w) f_{Y} (z-w) \, dw $$
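Before the proofs, here is a quick numerical sketch of the formula (not part of the argument): if $X, Y \sim N(0,1)$ are independent, then $Z = X + Y \sim N(0,2)$, so a Riemann-sum approximation of the convolution integral should reproduce the $N(0,2)$ density. The grid sizes below are arbitrary choices for illustration.

```python
import numpy as np

# Sanity check: X, Y ~ N(0,1) independent implies Z = X + Y ~ N(0,2).
# Approximate f_Z(z) = ∫ f_X(w) f_Y(z - w) dw by a Riemann sum and
# compare with the closed-form N(0,2) density.

def normal_pdf(x, var=1.0):
    return np.exp(-x ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

w = np.linspace(-10.0, 10.0, 4001)   # integration grid for the dummy variable
dw = w[1] - w[0]

errs = []
for z in np.linspace(-3.0, 3.0, 13):
    numeric = np.sum(normal_pdf(w) * normal_pdf(z - w)) * dw  # Riemann sum
    exact = normal_pdf(z, var=2.0)                            # N(0,2) density
    errs.append(abs(numeric - exact))

max_err = max(errs)
```

The grid $[-10, 10]$ is wide enough that the Gaussian tails are negligible, so the approximation error is far below the density values themselves.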
Proof
Simple derivation
If we set $W := X$, the map $(x, y) \mapsto (z, w) = (x + y, x)$ has Jacobian determinant $$ \begin{vmatrix} 1 & 1 \\ 1 & 0 \end{vmatrix} = -1 $$ with absolute value $\left| -1 \right| = 1$. By the change of variables $(x, y) = (w, z - w)$, the joint probability density function $f_{Z,W}$ of $Z$ and $W$ is $$ f_{Z,W} \left( z,w \right) = f_{X,Y} \left( w, z-w \right) \cdot 1 = f_{X} (w) f_{Y} (z-w) $$ where the last equality uses the independence of $X$ and $Y$. Therefore, the marginal probability density function of $Z$ is obtained by integrating out $w$ over $-\infty < w < \infty$ as follows. $$ f_{Z} (z) = \int_{-\infty}^{\infty} f_{X} (w) f_{Y} (z-w) \, dw $$
■
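As a concrete instance of the formula (my own illustrative example, not from the source), the convolution of two $\operatorname{Uniform}(0,1)$ densities is the triangular density $f_{Z}(z) = z$ for $0 \le z \le 1$ and $f_{Z}(z) = 2 - z$ for $1 \le z \le 2$. A Monte Carlo histogram of simulated sums should match it:

```python
import numpy as np

# Monte Carlo check: for independent X, Y ~ Uniform(0,1), the convolution
# formula gives the triangular density on [0, 2].  A normalized histogram
# of simulated sums should agree with it at the bin centers.

rng = np.random.default_rng(0)
n = 1_000_000
z = rng.uniform(0, 1, n) + rng.uniform(0, 1, n)

def f_z(t):
    # triangular density: t on [0,1], 2 - t on [1,2]
    return np.where(t < 1, t, 2 - t)

counts, edges = np.histogram(z, bins=40, range=(0, 2), density=True)
centers = (edges[:-1] + edges[1:]) / 2
max_err = float(np.max(np.abs(counts - f_z(centers))))
```

The bin edges are chosen so that $z = 1$, where the density has a kink, falls on an edge; within each bin the density is linear, so its value at the bin center equals its bin average and only sampling noise remains.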
Proof using the cumulative distribution function
The cumulative distribution function of $Z$ is
$$ F_{Z}(z) = P(Z \le z) = P(X+Y \le z) $$
Probability is the integral of the probability density function, and since $X$ and $Y$ are independent we have $f_{X,Y} = f_{X} f_{Y}$, so
$$ F_{Z}(z) = \iint\limits_{x+y \le z} f_{X,Y}(x,y) \mathrm{d}x \mathrm{d}y = \iint\limits_{x+y \le z} f_{X}(x) f_{Y}(y) \mathrm{d}x \mathrm{d}y $$
Writing the region $x + y \le z$ as an iterated integral,
$$ \begin{align*} F_{Z}(z) &= \int\limits_{-\infty}^{\infty} \int\limits_{-\infty}^{z-x} f_{X}(x) f_{Y}(y) \mathrm{d}y \mathrm{d}x \\ &= \int\limits_{-\infty}^{\infty} f_{X}(x) \left( \int\limits_{-\infty}^{z-x} f_{Y}(y) \mathrm{d}y\right) \mathrm{d}x \\ \end{align*} $$
The integral in parentheses is the cumulative distribution function of $Y$ evaluated at $z - x$, so
$$ F_{Z}(z) = \int\limits_{-\infty}^{\infty} f_{X}(x) F_{Y}(z-x) \mathrm{d}x $$
Now differentiating both sides with respect to $z$, interchanging the derivative and the integral, yields the conclusion.
$$ \begin{align*} f_{Z}(z) = \dfrac{\mathrm{d} F_{Z}(z)}{\mathrm{d}z} &= \int\limits_{-\infty}^{\infty} f_{X}(x) \dfrac{\partial F_{Y}(z - x)}{\partial z} \mathrm{d}x \\ &= \int\limits_{-\infty}^{\infty} f_{X}(x) f_{Y}(z-x) \mathrm{d}x \\ &= \left( f_{X} \ast f_{Y} \right) (z) \end{align*} $$
■
Proof using characteristic functions
Denote the characteristic function of the random variable $X$ by $\Phi_{X}$. The characteristic function of $Z$ is computed as follows.
$$ \Phi_{Z}(t) = \mathbb{E}\left[ e^{\mathrm{i} t (X+Y)} \right] = \mathbb{E}\left[ e^{\mathrm{i}tX} e^{\mathrm{i}tY} \right] = \mathbb{E}\left[ e^{\mathrm{i}tX} \right] \mathbb{E}\left[ e^{\mathrm{i}tY} \right] = \Phi_{X}(t) \Phi_{Y}(t) $$
The third equality holds because $X$ and $Y$ are independent.
Inverse Fourier transform theorem
$\hat{f} = \hat{g}$ implies $f = g$.
Properties of the Fourier transform
$$ \widehat{f \ast g} = \hat{f} \cdot \hat{g} $$
Since the characteristic function of a random variable is the Fourier transform of its probability density function, by the above lemmas we obtain
$$ \widehat{f_{Z}} = \Phi_{Z} = \Phi_{X} \Phi_{Y} = \widehat{f_{X}} \cdot \widehat{f_{Y}} = \widehat{f_{X} \ast f_{Y}} $$
$$ \implies f_{Z} = f_{X} \ast f_{Y} $$
■
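The lemma $\widehat{f \ast g} = \hat{f} \cdot \hat{g}$ has a finite, discrete analogue that is easy to check numerically (an illustration of the same idea, not a statement about densities): for finite sequences, the DFT of a circular convolution equals the pointwise product of the DFTs.

```python
import numpy as np

# Discrete analogue of the convolution lemma: the DFT of a circular
# convolution equals the pointwise product of the DFTs.

rng = np.random.default_rng(1)
N = 64
f = rng.random(N)
g = rng.random(N)

# circular convolution computed directly from the definition
conv = np.array([np.sum(f * g[(n - np.arange(N)) % N]) for n in range(N)])

lhs = np.fft.fft(conv)               # transform of the convolution
rhs = np.fft.fft(f) * np.fft.fft(g)  # product of the transforms
max_err = float(np.max(np.abs(lhs - rhs)))
```

The two sides agree up to floating-point rounding, mirroring the continuous identity used in the proof.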
Casella. (2001). Statistical Inference (2nd Edition): p215. ↩︎
