Finite Difference Method

Definition¹ ²

The finite difference method is a numerical method for computing derivatives that approximates the derivative by the average rate of change over a small interval.

Explanation

The key to deriving the formula is the Taylor expansion.

$$ f(x+h) = f(x) + f^{\prime}(x)h + \dfrac{f^{\prime \prime}(x)}{2!}h^{2} + \dfrac{f^{\prime \prime \prime}(x)}{3!}h^{3} + \cdots \tag{1} $$

Rearranging to have only the derivative on the left side,

$$ \begin{align*} f^{\prime}(x) &= \dfrac{f(x+h) - f(x)}{h} - \dfrac{f^{\prime \prime}(x)}{2!}h - \cdots \\ &= \dfrac{f(x+h) - f(x)}{h} + \mathcal{O}(h) \\ &\approx \dfrac{f(x+h) - f(x)}{h} \end{align*} $$

Here, $\mathcal{O}(h)$ is asymptotic (big-O) notation. If $h$ is sufficiently small, the right-hand side approximates the left-hand side quite well. Indeed, when derivatives are first taught, quantities such as instantaneous velocity and acceleration are introduced as limits of average rates of change.

Derivatives

Forward Difference

The derivative at $x$ is approximated by the average rate of change between $x$ and the next grid point $x+h$; this is called the forward difference. The error term is $\mathcal{O}(h)$.

$$ f^{\prime}(x) \approx \dfrac{f(x+h) - f(x)}{h} $$
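
As a quick check, here is a minimal Python sketch (not taken from the references; the function name `forward_difference` and the test point are illustrative assumptions). It compares the forward difference of $\sin$ at $x = 1$ against the exact derivative $\cos(1)$:

```python
import math

def forward_difference(f, x, h=1e-5):
    """Approximate f'(x) by the forward difference (f(x+h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h

# Example: f = sin, so the exact derivative at x = 1 is cos(1).
x = 1.0
approx = forward_difference(math.sin, x)
print(approx, math.cos(x), abs(approx - math.cos(x)))  # error is on the order of h
```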

Backward Difference

The derivative at $x$ is approximated by the average rate of change between $x$ and the previous grid point $x-h$; this is called the backward difference. The error term is $\mathcal{O}(h)$.

$$ f^{\prime}(x) \approx \dfrac{f(x) - f(x-h)}{h} $$
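
The implementation is analogous (again a minimal sketch; only the stencil changes):

```python
import math

def backward_difference(f, x, h=1e-5):
    """Approximate f'(x) by the backward difference (f(x) - f(x-h)) / h."""
    return (f(x) - f(x - h)) / h

print(backward_difference(math.sin, 1.0))  # close to cos(1) ~ 0.5403
```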

Central Difference

The central difference, unlike the previous two methods, has an error term of $\mathcal{O}(h^{2})$. By substituting $\pm\frac{h}{2}$ for $h$ in $(1)$,

$$ f(x+h/2) = f(x) + f^{\prime}(x)\frac{h}{2} + \dfrac{f^{\prime \prime}(x)}{2!}\left(\frac{h}{2}\right)^{2} + \dfrac{f^{\prime \prime \prime}(x)}{3!}\left(\frac{h}{2}\right)^{3} + \cdots $$

$$ f(x-h/2) = f(x) - f^{\prime}(x)\frac{h}{2} + \dfrac{f^{\prime \prime}(x)}{2!}\left(\frac{h}{2}\right)^{2} - \dfrac{f^{\prime \prime \prime}(x)}{3!}\left(\frac{h}{2}\right)^{3} + \cdots $$

Subtracting the second equation from the first,

$$ f(x+h/2) - f(x-h/2) = f^{\prime}(x)h + 2\dfrac{f^{\prime \prime \prime}(x)}{3!}\left(\frac{h}{2}\right)^{3} + \cdots $$

Rearranging for $f^{\prime}(x)$,

$$ \begin{align*} f^{\prime}(x) &= \frac{f(x+h/2) - f(x-h/2)}{h} - \dfrac{f^{\prime \prime \prime}(x)}{3!}\left(\frac{h}{2}\right)^{2} - \cdots \\ &= \frac{f(x+h/2) - f(x-h/2)}{h} + \mathcal{O}(h^{2}) \\ &\approx \frac{f(x+h/2) - f(x-h/2)}{h} \end{align*} $$
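
The quadratic convergence can be verified numerically: halving $h$ should shrink the error by a factor of roughly four, versus roughly two for the forward difference. A minimal sketch, under the same $f = \sin$ assumption as above:

```python
import math

def central_difference(f, x, h=1e-3):
    """Approximate f'(x) by the central difference (f(x + h/2) - f(x - h/2)) / h."""
    return (f(x + h / 2) - f(x - h / 2)) / h

x, exact = 1.0, math.cos(1.0)
for h in (1e-1, 1e-2, 1e-3):
    err = abs(central_difference(math.sin, x, h) - exact)
    print(f"h = {h:.0e}, error = {err:.2e}")  # error decreases like h^2
```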

Second-order Derivatives

Adding $(1)$ to the equation obtained by substituting $-h$ for $h$ in $(1)$ yields the following.

$$ \begin{align*} && f(x+h) &= f(x) + f^{\prime}(x)h + \dfrac{f^{\prime \prime}(x)}{2!}h^{2} + \dfrac{f^{\prime \prime \prime}(x)}{3!}h^{3} + \cdots \\ {+} && f(x-h) &= f(x) - f^{\prime}(x)h + \dfrac{f^{\prime \prime}(x)}{2!}h^{2} - \dfrac{f^{\prime \prime \prime}(x)}{3!}h^{3} + \cdots \\ \hline && f(x+h) + f(x-h) &= 2f(x) + f^{\prime \prime}(x)h^{2} + 2\dfrac{f^{\prime \prime \prime \prime}(x)}{4!}h^{4} + \cdots \end{align*} $$

Rearranging for $f^{\prime \prime}(x)$,

$$ \begin{align*} f^{\prime \prime}(x) &= \frac{f(x+h) - 2f(x) + f(x-h)}{h^{2}} - 2\dfrac{f^{\prime \prime \prime \prime}(x)}{4!}h^{2} - \cdots \\ &= \frac{f(x+h) - 2f(x) + f(x-h)}{h^{2}} + \mathcal{O}(h^{2}) \\ &\approx \frac{f(x+h) - 2f(x) + f(x-h)}{h^{2}} \end{align*} $$
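
A corresponding sketch for the second derivative (again with the illustrative choice $f = \sin$, so the exact value at $x = 1$ is $-\sin(1)$):

```python
import math

def second_difference(f, x, h=1e-3):
    """Approximate f''(x) by (f(x+h) - 2 f(x) + f(x-h)) / h^2; error is O(h^2)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2

x = 1.0
approx = second_difference(math.sin, x)
exact = -math.sin(x)
print(approx, exact, abs(approx - exact))  # error on the order of h^2
```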

Ignoring the error terms, the same formula can also be derived by applying the forward difference to $f^{\prime}$ and then the backward difference to each of the two resulting $f^{\prime}$ values.

$$ \begin{align*} f^{\prime \prime}(x) &\approx \dfrac{f^{\prime}(x+h) - f^{\prime}(x)}{h} \\ &\approx \dfrac{\dfrac{f(x+h)-f(x)}{h} - \dfrac{f(x)-f(x-h)}{h}}{h} \\ &= \dfrac{f(x+h) - 2f(x) + f(x-h)}{h^{2}} \end{align*} $$

  1. A. Iserles, A First Course in the Numerical Analysis of Differential Equations (2nd Ed., 2009), pp. 139-141

  2. Mykel J. Kochenderfer, Algorithms for Optimization (2019), pp. 23-26