
Adams Method

Definition 1

Multistep methods: Let $f$ be a continuous function on $D \subset \mathbb{R}^2$ and consider the initial value problem $\begin{cases} y ' = f(x,y) \\ ( y( x_{0} ) , \cdots , y(x_{p}) ) = (Y_{0}, \cdots , Y_{p} ) \end{cases}$. Divide the interval $[a,b]$ into nodes $a \le x_{0} < x_{1} < \cdots < x_{n} < \cdots < x_{N} \le b$; in particular, for a sufficiently small $h > 0$, set $x_{j} = x_{0} + j h$. A method of the form $$ y_{n+1} = \sum_{j=0}^{p} a_{j} y_{n-j} + h \sum_{j = -1}^{p} b_{j} f (x_{n-j} , y_{n-j} ) $$ is called a $(p+1)$-step method if $a_{p} \ne 0$ or $b_{p} \ne 0$.
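For example, Euler's method is the simplest instance of this form: a one-step method ($p = 0$) with $a_{0} = 1$, $b_{0} = 1$, and all other coefficients zero, $$ y_{n+1} = y_{n} + h f ( x_{n} , y_{n} ) $$ A nonzero $b_{-1}$ makes the method implicit, since the unknown $y_{n+1}$ then also appears on the right-hand side; the trapezoidal method, with $b_{-1} = b_{0} = 1/2$, is such a case.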

Adams-Bashforth Method

For the coefficients $\displaystyle \gamma_{j} := {{1} \over {j!} } \int_{0}^{1} s (s +1) \cdots (s+ j - 1) \, ds$, $$ y_{n+1} = y_{n} + h \sum_{j=0}^{p} \gamma_{j} \nabla^{j} y_{n}' $$ is called the Adams-Bashforth Method (of order $p+1$).
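The coefficients $\gamma_{j}$ can be verified in exact rational arithmetic by expanding the polynomial $s(s+1)\cdots(s+j-1)$ and integrating term by term. A minimal sketch (the function name is ours) reproducing $\gamma_{0}, \ldots, \gamma_{3} = 1, \tfrac{1}{2}, \tfrac{5}{12}, \tfrac{3}{8}$:

```python
import math
from fractions import Fraction

def gamma(j):
    """γ_j = (1/j!) ∫_0^1 s(s+1)···(s+j-1) ds, computed exactly."""
    # Expand the product into polynomial coefficients (lowest degree first).
    poly = [Fraction(1)]              # empty product = 1
    for k in range(j):                # multiply by (s + k)
        new = [Fraction(0)] * (len(poly) + 1)
        for i, c in enumerate(poly):
            new[i] += c * k           # constant part of (s + k)
            new[i + 1] += c           # s part of (s + k)
        poly = new
    # ∫_0^1 s^i ds = 1/(i+1), then divide by j!
    integral = sum(c * Fraction(1, i + 1) for i, c in enumerate(poly))
    return integral / math.factorial(j)

print([gamma(j) for j in range(4)])   # 1, 1/2, 5/12, 3/8
```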

  1. $p=0$ : $$y_{n+1} = y_{n} + h y_{n} ' $$
  2. $p=1$ : $$y_{n+1} = y_{n} + {{h} \over {2}} ( 3 y_{n}' - y_{n-1}' )$$
  3. $p=2$ : $$y_{n+1} = y_{n} + {{h} \over {12}} ( 23 y_{n}' - 16 y_{n-1}' + 5 y_{n-2}' )$$
  4. $p=3$ : $$y_{n+1} = y_{n} + {{h} \over {24}} ( 55 y_{n}' - 59 y_{n-1}' + 37 y_{n-2}' - 9 y_{n-3}' )$$
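As a sketch of how the $p=3$ formula is used in practice (the function name and test problem are our own): a multistep method needs its first four values from some other source, e.g. a Runge-Kutta method; here they are taken from the exact solution for illustration.

```python
import math

def adams_bashforth4(f, x0, y_start, h, n_steps):
    """Integrate y' = f(x, y) with the p = 3 Adams-Bashforth formula.
    y_start must hold the four starting values y_0, ..., y_3; a multistep
    method has to obtain them elsewhere, they are NOT produced here."""
    ys = list(y_start)
    fs = [f(x0 + i * h, ys[i]) for i in range(4)]
    for n in range(3, n_steps):
        y_next = ys[n] + h / 24 * (55 * fs[n] - 59 * fs[n - 1]
                                   + 37 * fs[n - 2] - 9 * fs[n - 3])
        ys.append(y_next)
        fs.append(f(x0 + (n + 1) * h, y_next))
    return ys

# Test problem: y' = y, y(0) = 1 on [0, 1]; exact solution is e^x.
h = 0.01
start = [math.exp(i * h) for i in range(4)]   # exact starting values, for illustration
ys = adams_bashforth4(lambda x, y: y, 0.0, start, h, 100)
print(abs(ys[-1] - math.e))                   # small; shrinks ~16x when h is halved
```

Note that each step reuses the stored derivative values $f_{n}, \ldots, f_{n-3}$, so only one new evaluation of $f$ is needed per step.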

Adams-Moulton Method

For the coefficients $\displaystyle \delta_{j} := {{1} \over {j!} } \int_{0}^{1} (s - 1 ) s (s +1) \cdots (s+ j - 2) \, ds$, $$ y_{n+1} = y_{n} + h \sum_{j=0}^{p} \delta_{j} \nabla^{j} y_{n+1}' $$ is called the Adams-Moulton Method (of order $p+1$).
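The coefficients $\delta_{j}$ can be checked the same way as the $\gamma_{j}$, by expanding $(s-1)s(s+1)\cdots(s+j-2)$ and integrating term by term; for $j = 0$ the product is empty and equals $1$. A minimal sketch (function name ours) reproducing $\delta_{0}, \ldots, \delta_{3} = 1, -\tfrac{1}{2}, -\tfrac{1}{12}, -\tfrac{1}{24}$:

```python
import math
from fractions import Fraction

def delta(j):
    """δ_j = (1/j!) ∫_0^1 (s-1)s(s+1)···(s+j-2) ds, computed exactly."""
    poly = [Fraction(1)]              # empty product = 1 (the j = 0 case)
    for k in range(-1, j - 1):        # factors (s - 1), s, ..., (s + j - 2)
        new = [Fraction(0)] * (len(poly) + 1)
        for i, c in enumerate(poly):
            new[i] += c * k           # constant part of (s + k)
            new[i + 1] += c           # s part of (s + k)
        poly = new
    integral = sum(c * Fraction(1, i + 1) for i, c in enumerate(poly))
    return integral / math.factorial(j)

print([delta(j) for j in range(4)])   # 1, -1/2, -1/12, -1/24
```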

  1. $p=0$ : $$y_{n+1} = y_{n} + h y_{n+1} ' $$
  2. $p=1$ : $$y_{n+1} = y_{n} + {{h} \over {2}} ( y_{n+1}' + y_{n}' )$$
  3. $p=2$ : $$y_{n+1} = y_{n} + {{h} \over {12}} ( 5 y_{n+1}' + 8 y_{n}' - y_{n-1}' )$$
  4. $p=3$ : $$y_{n+1} = y_{n} + {{h} \over {24}} ( 9 y_{n+1}' + 19 y_{n}' -5 y_{n-1}' + y_{n-2}' )$$
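Because $y_{n+1}'$ appears on the right-hand side, each Adams-Moulton step requires solving an implicit equation for $y_{n+1}$. A minimal sketch of the $p=1$ (trapezoidal) case, solving that equation by fixed-point iteration started from an Euler guess (function name and test problem are our own):

```python
import math

def adams_moulton1(f, x0, y0, h, n_steps, iters=10):
    """Integrate y' = f(x, y) with the p = 1 Adams-Moulton (trapezoidal)
    formula.  The implicit equation y_{n+1} = y_n + (h/2)(f_{n+1} + f_n)
    is solved by fixed-point iteration."""
    ys = [y0]
    for n in range(n_steps):
        xn, yn = x0 + n * h, ys[-1]
        fn = f(xn, yn)
        y = yn + h * fn                          # explicit Euler initial guess
        for _ in range(iters):                   # fixed point: converges if h·L/2 < 1
            y = yn + h / 2 * (fn + f(xn + h, y))
        ys.append(y)
    return ys

# Test problem: y' = -y, y(0) = 1 on [0, 1]; exact solution is e^{-x}.
ys = adams_moulton1(lambda x, y: -y, 0.0, 1.0, 0.1, 10)
print(abs(ys[-1] - math.exp(-1)))   # small; shrinks ~4x when h is halved
```

In practice the implicit equation is often solved with Newton's method instead, especially for stiff problems where fixed-point iteration fails to converge.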

Description

The Adams methods, as multistep methods, are commonly used in predictor-corrector algorithms, with both the step size $h$ and the order of the method varied adaptively. For example, using the first-order Adams-Bashforth method as a predictor and the second-order Adams-Moulton method as a corrector is essentially the same algorithm as using the Euler method as a predictor and the trapezoidal method as a corrector.

Generally, the Adams-Moulton method not only has smaller error than the Adams-Bashforth method of the same order but is also more stable, which can be understood by comparing their structures: the Adams-Bashforth method is an explicit method, whereas the Adams-Moulton method is an implicit one, since the unknown $y_{n+1}'$ appears on its right-hand side. Predictor-corrector algorithms for solving ODEs mix these two Adams methods appropriately.
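The Euler-predictor/trapezoidal-corrector pairing mentioned above can be sketched as a single predict-evaluate-correct step (the function name and test problem are our own; production codes iterate the corrector and adapt $h$ and the order):

```python
import math

def pece_step(f, xn, yn, h):
    """One predict-evaluate-correct step: Euler (p = 0 Adams-Bashforth)
    predictor, trapezoidal (p = 1 Adams-Moulton) corrector."""
    fn = f(xn, yn)
    y_pred = yn + h * fn                  # P: explicit Euler prediction
    f_pred = f(xn + h, y_pred)            # E: evaluate f at the prediction
    return yn + h / 2 * (fn + f_pred)     # C: trapezoidal correction

# Integrate y' = y, y(0) = 1 up to x = 1; the exact answer is e.
h, y = 0.1, 1.0
for i in range(10):
    y = pece_step(lambda x, u: u, i * h, y, h)
print(abs(y - math.e))   # small; shrinks ~4x when h is halved
```

The predictor supplies the $y_{n+1}'$ value the implicit corrector needs, so the implicit equation never has to be solved exactly.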


  1. Atkinson. (1989). An Introduction to Numerical Analysis (2nd Edition): p385–388. ↩︎