
Delta Method in Mathematical Statistics

Theorem

Let $\theta \in \mathbb{R}$ be a constant and let $\left\{ Y_{n} \right\}_{n \in \mathbb{N}}$ be a sequence of random variables such that $\sqrt{n} \left( Y_{n} - \theta \right)$ converges in distribution to the normal distribution $N \left(0, \sigma^{2} \right)$.

First-Order Delta Method¹

If $g ' (\theta)$ exists and is nonzero, $$ \sqrt{n} \left[ g \left( Y_{n} \right) - g(\theta) \right] \overset{D}{\to} N \left( 0, \sigma^{2} \left[ g ' (\theta) \right]^{2} \right) $$
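As a quick illustration (an assumed setting, not part of the cited statement), take $Y_{n} = \overline{X}_{n}$ with $\sqrt{n} \left( \overline{X}_{n} - \mu \right) \overset{D}{\to} N \left( 0, \sigma^{2} \right)$ for some $\mu > 0$, and $g(x) := \log x$. Since $g ' (\mu) = 1/\mu \ne 0$, the first-order method gives $$ \sqrt{n} \left[ \log \overline{X}_{n} - \log \mu \right] \overset{D}{\to} N \left( 0, {{ \sigma^{2} } \over { \mu^{2} }} \right) $$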

Second-Order Delta Method²

If $g ' (\theta) = 0$ and $g''(\theta)$ exists and is nonzero, $$ n \left[ g \left( Y_{n} \right) - g(\theta) \right] \overset{D}{\to} \sigma^{2} {{ g''\left( \theta \right) } \over { 2 }} \chi_{1}^{2} $$
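For instance (again an assumed setting for illustration), if $\theta = 0$ and $g(x) := x^{2}$, then $g ' (0) = 0$ and $g''(0) = 2$, so $$ n Y_{n}^{2} \overset{D}{\to} \sigma^{2} \chi_{1}^{2} $$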

Generalized Delta Method

Now suppose that, for some sequence of constants $k_{n}$ depending on $n$, $k_{n} \left( Y_{n} - \theta \right) \overset{D}{\to} Y$. If

  • For $k = 1, \dots, r-1$, $g^{(k)} (\theta) = 0$
  • $g^{(r)} (\theta)$ exists and is nonzero
  • $g^{(r)}$ is continuous at $\theta$

Then, $$ k_{n}^{r} \left[ g \left( Y_{n} \right) - g (\theta) \right] \overset{D}{\to} {{ g^{(r)} (\theta) } \over { r! }} Y^{r} $$
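As a sanity check, setting $k_{n} = \sqrt{n}$, $Y \sim N \left( 0, \sigma^{2} \right)$, and $r = 2$ recovers the second-order method: writing $Y = \sigma Z$ with $Z \sim N(0,1)$, $$ n \left[ g \left( Y_{n} \right) - g (\theta) \right] \overset{D}{\to} {{ g''(\theta) } \over { 2 }} Y^{2} = \sigma^{2} {{ g''(\theta) } \over { 2 }} Z^{2} \sim \sigma^{2} {{ g''(\theta) } \over { 2 }} \chi_{1}^{2} $$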


Explanation

The Delta Method is widely used as an auxiliary result for establishing convergence in distribution throughout mathematical statistics.

Example

For instance, if the mean $\mu$ and variance $\operatorname{Var} X$ of $X$ are known, the first-order Taylor expansion about $\mu$, $$ g(X) \approx g (\mu) + g ' (\mu) \left( X - \mu \right) $$ yields the approximations $$ \begin{align*} E g(X) \approx & g (\mu) \\ \operatorname{Var} g (X) \approx & \left[ g ' (\mu) \right]^2 \operatorname{Var} X \end{align*} $$ If one is interested in the mean and variance of the inverse of $X$, applying this to the function $g(x) := 1/x$ gives $$ \begin{align*} E {{ 1 } \over { X }} \approx & {{ 1 } \over { \mu }} \\ \operatorname{Var} {{ 1 } \over { X }} \approx & \left[ {{ 1 } \over { \mu }} \right]^{4} \operatorname{Var} X \end{align*} $$ Strictly speaking this is not the Delta Method itself, but it shows the same idea at work: a tool for handling functions of random variables.
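The approximation above is easy to check numerically. Below is a minimal simulation sketch; the distribution $N(\mu, \sigma^{2})$ and the particular values $\mu = 10$, $\sigma = 1$ are assumptions chosen only so that $1/X$ is well behaved.

```python
import numpy as np

# Minimal sketch: check Var(1/X) ≈ (1/mu)^4 * Var(X) by Monte Carlo.
# The choices X ~ N(mu, sigma^2), mu = 10, sigma = 1 are assumptions made
# only so that 1/X is numerically well behaved (mu far from zero).
rng = np.random.default_rng(0)
mu, sigma, n_samples = 10.0, 1.0, 1_000_000

x = rng.normal(mu, sigma, size=n_samples)

empirical_var = np.var(1 / x)               # simulated Var(1/X)
delta_var = (1 / mu) ** 4 * sigma ** 2      # delta-method approximation

print(f"empirical Var(1/X):  {empirical_var:.6f}")
print(f"delta approximation: {delta_var:.6f}")
```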

Proof

Strategy: the proof comes down to a Taylor expansion followed by Slutsky’s theorem.

Slutsky’s theorem: For constants $a, b$ and random variables $A_{n}, B_{n}, X_{n}, X$, if $A_{n} \overset{P}{\to} a $, $ B_{n} \overset{P}{\to} b $, and $ X_{n} \overset{D}{\to} X $, then $$ A_{n} + B_{n} X_{n} \overset{D}{\to} a + b X $$

The proof of the generalized Delta Method is omitted.

Proof of the First-Order Delta Method

Expanding $g \left( Y_{n} \right)$ around $\theta$ yields a remainder term $R$ with $R \to 0$ as $Y_{n} \to \theta$: $$ g \left( Y_{n} \right) = g (\theta) + g ' (\theta) \left( Y_{n} - \theta \right) + R $$ Moving $g(\theta)$ to the left-hand side and multiplying by $\sqrt{n}$, $$ \sqrt{n} \left[ g \left( Y_{n} \right) - g(\theta) \right] \approx g ' (\theta) \sqrt{n} \left( Y_{n} - \theta \right) $$ Since $\sqrt{n} \left( Y_{n} - \theta \right) \overset{D}{\to} N \left( 0, \sigma^{2} \right)$ and the remainder term vanishes in probability (because $Y_{n} \overset{P}{\to} \theta$), the claim follows by Slutsky’s theorem.
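The statement can also be checked by simulation. Below is a minimal sketch under an assumed setting ($Y_{n}$ the sample mean of Exponential$(1)$ data, $g(x) = \log x$), not part of the proof itself.

```python
import numpy as np

# Minimal sketch of the first-order claim. Assumed setting: Y_n is the sample
# mean of Exponential(1) data, so theta = 1, sigma^2 = 1, and g(x) = log(x)
# gives the limit N(0, sigma^2 * g'(theta)^2) = N(0, 1). The values of n and
# the number of replications are arbitrary choices for illustration.
rng = np.random.default_rng(1)
n, reps = 1_000, 10_000

y_n = rng.exponential(1.0, size=(reps, n)).mean(axis=1)  # reps copies of Y_n
z = np.sqrt(n) * (np.log(y_n) - np.log(1.0))             # sqrt(n)[g(Y_n) - g(theta)]

print(f"sample mean:     {z.mean():.4f}  (limit: 0)")
print(f"sample variance: {z.var():.4f}  (limit: 1)")
```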

Proof of the Second-Order Delta Method

Similarly, expanding $g \left( Y_{n} \right)$ around $\theta$ yields a remainder term $R$ with $R \to 0$ as $Y_{n} \to \theta$: $$ \begin{align*} g \left( Y_{n} \right) =& g (\theta) + g ' (\theta) \left( Y_{n} - \theta \right) + {{ g''(\theta) } \over { 2 }} \left( Y_{n} - \theta \right)^{2} + R \\ =& g (\theta) + 0 \cdot \left( Y_{n} - \theta \right) + {{ \sigma^{2} } \over { \sigma^{2} }} {{ g''(\theta) } \over { 2 }} \left( Y_{n} - \theta \right)^{2} + R \end{align*} $$ Again, moving $g(\theta)$ to the left-hand side and multiplying by $n$, $$ n \left[ g \left( Y_{n} \right) - g (\theta) \right] \approx \sigma^{2} {{ g''(\theta) } \over { 2 }} {{ \left( Y_{n} - \theta \right)^{2} } \over { \sigma^{2}/n }} $$ Since $\sqrt{n} \left( Y_{n} - \theta \right) / \sigma$ converges in distribution to a standard normal, its square $\left( Y_{n} - \theta \right)^{2} / \left( \sigma^{2}/n \right)$ converges in distribution to $\chi_{1}^{2}$, and the claim follows by Slutsky’s theorem.
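As with the first-order case, the claim can be checked by simulation. Below is a minimal sketch under an assumed setting ($\theta = 0$, $g(x) = x^{2}$, standard normal data), not part of the proof itself.

```python
import numpy as np

# Minimal sketch of the second-order claim. Assumed setting: theta = 0,
# g(x) = x^2, and Y_n is the sample mean of N(0, 1) data, so sigma^2 = 1,
# g''(0) = 2, and the limit sigma^2 * g''(0)/2 * chi^2_1 is just chi^2_1.
# The values of n and the number of replications are arbitrary choices.
rng = np.random.default_rng(2)
n, reps = 1_000, 10_000

y_n = rng.normal(0.0, 1.0, size=(reps, n)).mean(axis=1)  # reps copies of Y_n
w = n * y_n ** 2                                         # n[g(Y_n) - g(theta)]

print(f"sample mean:    {w.mean():.4f}  (chi^2_1 mean: 1)")
print(f"P(w <= 3.841):  {(w <= 3.841).mean():.4f}  (chi^2_1: 0.95)")
```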


  1. Casella. (2001). Statistical Inference (2nd Edition): p242. ↩︎

  2. Casella. (2001). Statistical Inference (2nd Edition): p244. ↩︎