Rosenbrock Method for Solving Differential Equations
Method 1
Given the ordinary differential equation (ODE) $\dot{y} = f(y)$, the following update rule is called the $s$-stage Rosenbrock method. $$ \begin{align*} k_{i} =& h f \left( y_{n} + \sum_{j=1}^{i-1} \alpha_{ij} k_{j} \right) + h J \sum_{j=1}^{i} \gamma_{ij} k_{j} , & i = 1 , \cdots , s \\ y_{n+1} =& y_{n} + \sum_{i=1}^{s} b_{i} k_{i} \end{align*} $$ Here, $J$ is the Jacobian of $f$ evaluated at $y_{n}$, and $\alpha_{ij}$, $\gamma_{ij}$, $b_{i}$ are predetermined coefficients.
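The update rule above can be sketched directly in code. This is a minimal sketch, not a production solver: the function name `rosenbrock_step` and the test problem $\dot{y} = -y$ are my own choices, and the coefficient arrays are passed in by the caller since the method leaves them unspecified. Note that the inner $\gamma_{ii} k_{i}$ term is moved to the left-hand side, so each stage is one linear solve with the matrix $I - h \gamma_{ii} J$.

```python
import numpy as np

def rosenbrock_step(f, jac, y, h, alpha, gamma, b):
    """One step of an s-stage Rosenbrock method.

    alpha : (s, s) strictly lower-triangular stage coefficients
    gamma : (s, s) lower-triangular coefficients (diagonal nonzero)
    b     : (s,) weights
    Stage i solves (I - h*gamma[i,i]*J) k_i
        = h f(y + sum_{j<i} alpha[i,j] k_j) + h J sum_{j<i} gamma[i,j] k_j.
    """
    s, n = len(b), len(y)
    J = jac(y)                      # Jacobian evaluated once, at y_n
    I = np.eye(n)
    k = np.zeros((s, n))
    for i in range(s):
        yi = y + sum((alpha[i][j] * k[j] for j in range(i)), np.zeros(n))
        rhs = h * f(yi) + h * (J @ sum((gamma[i][j] * k[j] for j in range(i)),
                                       np.zeros(n)))
        # linear solve instead of forming the inverse of (I - h*gamma_ii*J)
        k[i] = np.linalg.solve(I - h * gamma[i][i] * J, rhs)
    return y + np.asarray(b) @ k

# Example: s = 1 with gamma_11 = 1 on y' = -y reproduces one
# linearly implicit Euler step, y_1 = y_0 / (1 + h).
y1 = rosenbrock_step(lambda y: -y, lambda y: -np.eye(1),
                     np.array([1.0]), 0.1,
                     np.array([[0.0]]), np.array([[1.0]]), np.array([1.0]))
```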
Explanation
In simple terms, it is an implicit Runge–Kutta method that incorporates the idea of linearization. Unlike RK4, which is commonly used for solving ODEs, it is primarily used to solve stiff problems.
Implicit Runge–Kutta method: $$ \begin{align*} \xi_{j} =& y_{n} + h\sum_{i=1}^{\nu}a_{j,i}f(t_{n} + c_{i}h, \xi_{i}),\quad j=1,\dots,\nu \\ y_{n+1} =& y_{n} + h\sum_{j=1}^{\nu} b_{j} f(t_{n} + c_{j}h, \xi_{j}),\quad n=0,1,2,\dots \end{align*} $$
The idea of linearization is evident from the Jacobian $J$, so let’s look at an example for $s = 1 , 2$ to get an intuition for why this is an implicit method.
ROS1 (1-stage Rosenbrock method)
$$ \begin{align*} k_{1} =& \left( I - \gamma h J \right)^{-1} h f \left( y_{n} \right) \\ y_{n+1} =& y_{n} + k_{1} \end{align*} $$
By the definition of $k_{i}$, the stage $k_{i}$ appears on both sides (through the $\gamma_{ii}$ term), so computing each stage entails solving a linear system with the matrix $I - \gamma h J$. As the dimension of the system grows, the cost of these linear solves grows with it, so one can infer that Rosenbrock methods are inherently not well suited for very high-dimensional problems.
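To see why the implicit treatment matters, here is a sketch of ROS1 on the stiff scalar ODE $\dot{y} = -50 y$ (my own test problem, with the free parameter $\gamma = 1$, which makes ROS1 the linearly implicit Euler method), compared against explicit Euler at the same step size:

```python
import numpy as np

f = lambda y: -50.0 * y            # stiff scalar ODE y' = -50 y
jac = lambda y: np.array([[-50.0]])
h = 0.1                            # explicit Euler is only stable for h < 2/50

y_ros, y_euler = np.array([1.0]), np.array([1.0])
for _ in range(10):
    # ROS1 with gamma = 1: solve (I - h*J) k1 = h f(y), no explicit inverse
    k1 = np.linalg.solve(np.eye(1) - h * jac(y_ros), h * f(y_ros))
    y_ros = y_ros + k1
    y_euler = y_euler + h * f(y_euler)   # explicit Euler for comparison

print(y_ros)    # decays toward 0, as the exact solution does
print(y_euler)  # amplified by a factor of (1 - 5) = -4 each step: blows up
```

The exact solution decays to zero; ROS1 follows it even with this large step, while explicit Euler oscillates with growing amplitude.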
ROS2 (2-stage Rosenbrock method)
$$ \begin{align*} k_{1} =& \left( I - \gamma h J \right)^{-1} h f \left( y_{n} \right) \\ k_{2} =& \left( I - \gamma h J \right)^{-1} h f \left( y_{n} + k_{1} \right) \\ y_{n+1} =& y_{n} + b_{1} k_{1} + b_{2} k_{2} \end{align*} $$
Even from the two-stage case one can see that Rosenbrock methods are fundamentally a type of Runge–Kutta method.
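The two-stage formulas above can be sketched as follows. The coefficient values are an assumption of this sketch, since the text leaves them unspecified: for this particular form (where only $\gamma_{11} = \gamma_{22} = \gamma$ appear), a Taylor expansion gives the order-2 conditions $b_{1} + b_{2} = 1$ and $b_{2} + \gamma = 1/2$, and $\gamma = 1 - 1/\sqrt{2}$ is a common choice. Note that both stages reuse the same matrix $I - \gamma h J$, so its factorization can be shared.

```python
import numpy as np

GAMMA = 1.0 - 1.0 / np.sqrt(2.0)  # an assumed, commonly used value
B2 = 0.5 - GAMMA                  # order-2 condition: b2 + gamma = 1/2
B1 = 1.0 - B2                     # order-1 condition: b1 + b2 = 1

def ros2_step(f, jac, y, h):
    """One step of the 2-stage Rosenbrock method in the form shown above."""
    A = np.eye(len(y)) - GAMMA * h * jac(y)  # same matrix for both stages
    k1 = np.linalg.solve(A, h * f(y))
    k2 = np.linalg.solve(A, h * f(y + k1))
    return y + B1 * k1 + B2 * k2

# Accuracy check on y' = -y, y(0) = 1, integrated to t = 1 with h = 0.1
f, jac = (lambda y: -y), (lambda y: -np.eye(1))
y = np.array([1.0])
for _ in range(10):
    y = ros2_step(f, jac, y, 0.1)
print(y)   # close to the exact value exp(-1) ≈ 0.3679
```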
