
Consistency and Order of Convergence of Multistep Methods

Theorem

The necessary and sufficient condition for a multistep method $$ y_{n+1} = \sum_{j=0}^{p} a_{j} y_{n-j} + h \sum_{j=-1}^{p} b_{j} f ( x_{n-j} , y_{n-j} ) $$ for the initial value problem $\begin{cases} y ' = f(x,y) \\ ( y( x_{0} ) , \cdots , y(x_{p}) )= ( Y_{0} , \cdots , Y_{p} ) \end{cases}$ to be consistent is (i), and the necessary and sufficient condition for it to have convergence order $m \in \mathbb{N}$ is (ii).

  • (i): $$\begin{cases} \displaystyle \sum_{j = 0}^{p} a_{j} = 1 \\ \displaystyle - \sum_{j = 0}^{p} j a_{j} + \sum_{j = -1}^{p} b_{j} = 1 \end{cases}$$
  • (ii): For $i = 0, 1 , \cdots , m$, $$\sum_{j=0}^{p} (-j)^{i} a_{j} + i \sum_{j=-1}^{p} (- j )^{i-1} b_{j} = 1$$
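As a concrete illustration (my own check, not from the source), the sketch below plugs the coefficients of the two-step Adams–Bashforth method $y_{n+1} = y_{n} + h \left( \frac{3}{2} f_{n} - \frac{1}{2} f_{n-1} \right)$ into condition (ii) using exact rational arithmetic; the helper name `condition_lhs` is mine.

```python
from fractions import Fraction as F

def condition_lhs(a, b, i):
    """Left-hand side of condition (ii):
    sum_{j=0}^{p} (-j)^i a_j + i * sum_{j=-1}^{p} (-j)^(i-1) b_j."""
    # note: Python's 0**0 == 1 matches the (-j)^i convention used here
    s = sum(F(-j) ** i * aj for j, aj in a.items())
    if i >= 1:  # the b-sum carries a factor of i, so it drops out when i = 0
        s += i * sum(F(-j) ** (i - 1) * bj for j, bj in b.items())
    return s

# Two-step Adams-Bashforth (p = 1): y_{n+1} = y_n + h*(3/2*f_n - 1/2*f_{n-1})
a = {0: F(1), 1: F(0)}                       # a_j for j = 0, ..., p
b = {-1: F(0), 0: F(3, 2), 1: F(-1, 2)}      # b_j for j = -1, ..., p

for i in range(4):
    print(i, condition_lhs(a, b, i))         # 1, 1, 1, -3/2
```

The value equals $1$ for $i = 0, 1, 2$ but not for $i = 3$, so by the theorem this method is consistent and has convergence order exactly $2$.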

Explanation

It is more appropriate to think of multistep methods as a family encompassing many individual methods rather than as any single method. This theorem makes it easy to check consistency and convergence order across that family, and since both (i) and (ii) are necessary and sufficient conditions, it also tells us when a method is meaningful at all and how its convergence can be sped up.

Proof¹

(i)

The truncation error $\displaystyle T_{n} (Y) := Y_{n+1} - \left[ \sum_{j=0}^{p} a_{j} Y_{n-j} + h \sum_{j = -1}^{p} b_{j} Y'_{n-j} \right]$ is linear, since it is a linear combination of evaluations of $Y$ and $Y'$. That is, $$ T_{n} ( \alpha Y + \beta W ) = \alpha T_{n} (Y) + \beta T_{n} (W) $$


Part 1.

Expanding the solution $Y(x)$ of the initial value problem in a Taylor series about $x_{n}$ up to order $m$, we get $$ Y(x) = \sum_{i=0}^{m} {{1} \over {i! }} (x - x_{n} )^{i} Y^{ ( i ) } ( x_{n} ) + R_{m+1} (x) $$ Applying $T_{n}$ and using its linearity, we have $$ T_{n} (Y) = \sum_{i=0}^{m} {{1} \over {i! }} Y^{ ( i ) } ( x_{n} ) T_{n} \left( (x - x_{n} )^{i} \right) + T_{n} \left( R_{m+1} \right) $$


Part 2.

Let’s calculate $T_{n} \left( (x - x_{n} )^{i} \right)$.

  • Case 1. $i = 0$
    $$ T_{n} (1) = 1 - \left[ \sum_{j=0}^{p} a_{j} \cdot 1 + h \sum_{j = -1}^{p} b_{j} ( 1 ) ' \right] = 1 - \sum_{j=0}^{p} a_{j} $$ Here, let $\displaystyle c_{0} : = 1 - \sum_{j=0}^{p} a_{j}$.
  • Case 2. $i \ge 1$
    $$ \begin{align*} \displaystyle T_{n} \left( ( x - x_{n})^{i} \right) =& (x_{n+1} - x_{n})^{i} - \left[ \sum_{j=0}^{p} a_{j} ( x_{n-j} - x_{n} )^{i} + h \sum_{j = -1}^{p} b_{j} i ( x_{n-j} - x_{n} )^{i-1} \right] \\ =& h^{i} - \left[ h^{i} \sum_{j=0}^{p} a_{j} ( -j )^{i} + h^{i} i \sum_{j = -1}^{p} b_{j} ( -j )^{i-1} \right] \\ =& h^{i} \left[ 1 - \left( \sum_{j=0}^{p} ( -j )^{i} a_{j} + i \sum_{j = -1}^{p} ( -j )^{i-1} b_{j} \right) \right] \end{align*} $$ Here, define $\displaystyle c_{i} := 1 - \left( \sum_{j=0}^{p} ( -j )^{i} a_{j} + i \sum_{j = -1}^{p} ( -j )^{i-1} b_{j} \right)$.

In either case, we obtain the following. $$ T_{n} (Y) = \sum_{i=0}^{m} {{c_{i} h^{i} } \over {i! }} Y^{ ( i ) } ( x_{n} ) + T_{n} \left( R_{m+1} \right) $$
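If the bookkeeping above feels error-prone, it can be verified symbolically. The sketch below (my construction, again using the two-step Adams–Bashforth coefficients as an example) applies $T_{n}$ to $(x - x_{n})^{i}$ directly and confirms that the result is $c_{i} h^{i}$.

```python
import sympy as sp

h, x, xn = sp.symbols('h x x_n')

# Two-step Adams-Bashforth (p = 1), same coefficients as before
a = {0: sp.Integer(1), 1: sp.Integer(0)}
b = {-1: sp.Integer(0), 0: sp.Rational(3, 2), 1: sp.Rational(-1, 2)}

def T(Y):
    """Truncation error T_n(Y), with grid points x_{n-j} = x_n - j*h."""
    Yp = sp.diff(Y, x)
    approx = sum(aj * Y.subs(x, xn - j * h) for j, aj in a.items()) \
           + h * sum(bj * Yp.subs(x, xn - j * h) for j, bj in b.items())
    return sp.expand(Y.subs(x, xn + h) - approx)

for i in range(5):
    print(i, T((x - xn) ** i))   # c_i * h**i: here 0, 0, 0, 5*h**3/2, -h**4
```

The first nonzero value appears at $i = 3$, consistent with $c_{0} = c_{1} = c_{2} = 0$ and $c_{3} \ne 0$ for this order-2 method.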


Part 3.

For the multistep method to be consistent, it must satisfy $$ \lim_{h \to 0} \max_{x_{p} \le x_{n} \le b} \left| {{T_{n} (Y)} \over {h}} \right| = 0 $$ that is, $T_{n} (Y) = o(h)$ uniformly. From Part 2, $$ T_{n} (Y) = \sum_{i=0}^{m} {{c_{i} h^{i} } \over {i! }} Y^{ ( i ) } ( x_{n} ) + T_{n} \left( R_{m+1} \right) $$ was established, and since consistency must hold for every smooth solution $Y$, this forces $c_{0} = c_{1} = 0$ (take $m = 1$), in which case $T_{n} (Y) = O(h^{2})$. Writing the two conditions out, $$ \begin{cases} \displaystyle \sum_{j = 0}^{p} a_{j} = 1 \\ \displaystyle - \sum_{j = 0}^{p} j a_{j} + \sum_{j = -1}^{p} b_{j} = 1 \end{cases} $$
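For instance (an example of mine, not in the source), the midpoint method $y_{n+1} = y_{n-1} + 2 h f ( x_{n} , y_{n} )$ has $p = 1$, $a_{0} = 0$, $a_{1} = 1$, $b_{-1} = b_{1} = 0$, $b_{0} = 2$, and indeed $$ \sum_{j=0}^{1} a_{j} = 0 + 1 = 1 , \qquad - \sum_{j=0}^{1} j a_{j} + \sum_{j=-1}^{1} b_{j} = -1 + 2 = 1 $$ so it is consistent.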

(ii)

Part 4.

Since $$ R_{m+1} = {{1} \over { (m+1)! }} (x - x_{n} )^{m+1} Y^{(m+1)} (x_{n}) + \cdots = O (h^{m+1} ) $$ we have $$ T_{n} ( R_{m+1} ) = {{ c_{m+1} } \over { (m+1)! }} h^{m+1} Y^{(m+1)} (x_{n}) + O (h^{m+2} ) $$ For the multistep method to have convergence order $m$, it must satisfy $$ \max_{x_{p} \le x_{n} \le b} \left| {{T_{n} (Y)} \over {h}} \right| = O ( h^{m} ) $$ so it suffices that $T_{n} (Y) = O(h^{m+1})$, which, since it must hold for every smooth solution $Y$, is exactly the condition $c_{i} = 0$ for $i = 0, 1 , \cdots , m$. In summary, for $i = 0, 1 , \cdots , m$, $$ \sum_{j=0}^{p} (-j)^{i} a_{j} + i \sum_{j=-1}^{p} (- j )^{i-1} b_{j} = 1 $$
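To see the convergence order in practice, the following sketch (mine; it assumes exact starting values, which the theorem's initial condition $( y( x_{0} ) , \cdots , y(x_{p}) )= ( Y_{0} , \cdots , Y_{p} )$ permits) integrates $y ' = y$ on $[0, 1]$ with the two-step Adams–Bashforth method and halves $h$; the maximum error should shrink by a factor of about $2^{2} = 4$ each time, matching order $m = 2$.

```python
import math

def ab2(f, x0, y0, y1, h, n_steps):
    """Two-step Adams-Bashforth; y1 is the starting value at x0 + h."""
    ys = [y0, y1]
    for n in range(1, n_steps):
        xn, xn1 = x0 + n * h, x0 + (n - 1) * h
        ys.append(ys[n] + h * (1.5 * f(xn, ys[n]) - 0.5 * f(xn1, ys[n - 1])))
    return ys

f = lambda x, y: y                          # y' = y, exact solution e^x
for h in (0.1, 0.05, 0.025):
    n = round(1.0 / h)
    ys = ab2(f, 0.0, 1.0, math.exp(h), h, n)
    err = max(abs(yk - math.exp(k * h)) for k, yk in enumerate(ys))
    print(h, err)                           # error drops ~4x per halving: O(h^2)
```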


  1. Atkinson. (1989). An Introduction to Numerical Analysis (2nd Edition): p358–359. ↩︎