
Lipschitz Condition

Definition

The Lipschitz condition appears in the statement of the Existence-Uniqueness Theorem for First Order Differential Equations.

For a continuous function $f$ defined on $D \subset \mathbb{R}^2$ and an initial value problem $\begin{cases} y ' = f(x,y) \\ y( x_{0} ) = Y_{0} \end{cases}$, suppose $f$ satisfies the Lipschitz condition; that is, there exists $K > 0$ such that for all $(x,y_{1}) , (x , y_{2} ) \in D$, $$ |f(x,y_{1} ) - f(x,y_{2}) | \le K | y_{1} - y_{2} | $$ Then for every $(x_{0} , Y_{0}) \in D^{\circ}$ there exists a unique solution $Y(x)$ on an appropriate interval $I := [ x_{0} - \alpha , x_{0} + \alpha ]$.

Explanation

Rewriting the Lipschitz condition in a more familiar form gives $$ \left| { f(x,y_{1} ) - f(x,y_{2}) } \over { y_{1} - y_{2} } \right| \le K $$ In the worst case one can take $$ K = \max_{(x,y) \in D} \left| {{ \partial f(x,y) } \over { \partial y }} \right| $$ so boundedness of the partial derivative of $f$ with respect to $y$ amounts to essentially the same condition. It guarantees that the function values cannot change abruptly, at least near the initial value $( x_{0} , Y_{0} )$, which marks out a class of problems that are comparatively easy to solve.
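The relationship between $K$ and $\max |\partial f / \partial y|$ can be checked numerically. Below is a minimal sketch with a hypothetical choice $f(x,y) = x \sin y$ on $D = [0,2] \times [-5,5]$, where $\partial f / \partial y = x \cos y$, so the Lipschitz constant is $K = 2$; the function name `estimate_lipschitz` is an illustration, not from the text.

```python
import numpy as np

# Hypothetical example: f(x, y) = x * sin(y) on D = [0, 2] x [-5, 5].
# Here df/dy = x * cos(y), so the Lipschitz constant in y is K = 2.
def f(x, y):
    return x * np.sin(y)

def estimate_lipschitz(f, x_range, y_range, n=200):
    """Estimate K = max |f(x, y1) - f(x, y2)| / |y1 - y2| on a grid,
    by comparing function values at neighbouring y grid points."""
    xs = np.linspace(*x_range, n)
    ys = np.linspace(*y_range, n)
    X, Y = np.meshgrid(xs, ys)
    h = ys[1] - ys[0]
    ratios = np.abs(f(X, Y + h) - f(X, Y)) / h
    return ratios.max()

K = estimate_lipschitz(f, (0, 2), (-5, 5))  # close to 2
```

Since $|\sin y_1 - \sin y_2| \le |y_1 - y_2|$, every grid ratio stays below $2$, and the maximum is attained near $y = 0$ where $\cos y \approx 1$.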

This condition is needed to discuss the concept of stability. Consider adding a small perturbation $\delta (x)$ to the equation and $\epsilon$ to the initial value of the problem above: $$ \begin{cases} y ' (x ; \epsilon) = f(x, Y(x ;\epsilon ) ) + \delta (x) \\ Y( x_{0} ; \epsilon ) = Y_{0} + \epsilon \end{cases} $$ Although the two problems are mathematically different, if $| \delta |$ and $| \epsilon |$ are sufficiently small and the Lipschitz condition is satisfied, then the following holds.

For a continuous function $f$ defined on $D \subset \mathbb{R}^2$ and the perturbed initial value problem $\begin{cases} y ' (x ; \epsilon) = f(x, Y(x ;\epsilon ) ) + \delta (x) \\ Y( x_{0} ; \epsilon ) = Y_{0} + \epsilon \end{cases}$, if $f$ satisfies the Lipschitz condition, then for every $(x_{0} , Y_{0}) \in D^{\circ}$ there exists a sufficiently small $\epsilon_{0} > 0$ such that, whenever $| \epsilon | \le \epsilon_{0}$ and $\| \delta \|_{\infty} \le \epsilon_{0}$, a unique solution $Y(x ; \delta, \epsilon )$ exists on an appropriate interval $I := [ x_{0} - \alpha , x_{0} + \alpha ]$.
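To see what this stability looks like concretely, here is a minimal sketch for a hypothetical well-behaved problem, not taken from the text: $y' = -y$, $y(0) = 1$, whose exact solution is $y = e^{-x}$. Here $f(x,y) = -y$ satisfies a Lipschitz condition with $K = 1$, and perturbing the initial value by $\epsilon$ changes the solution only by $\epsilon e^{-x}$, which shrinks as $x$ grows.

```python
import math

# Hypothetical example: y' = -y, y(0) = 1 + eps has exact solution
# y(x) = (1 + eps) * e^{-x}, so the two solutions differ by eps * e^{-x}.
def y(x, eps=0.0):
    return (1 + eps) * math.exp(-x)

eps = 1e-3
deviation = abs(y(2.0, eps) - y(2.0))  # = eps * e^{-2}, smaller than eps
```

The deviation never exceeds the initial perturbation $\epsilon$, which is exactly the behavior the theorem describes.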

The importance of stability in solving differential equations shows up when considering numerical approximations. It would be problematic if the entire model had to be overhauled every time a slight numerical change occurred as new data is continually added.

Let us take an example to understand what happens when the Lipschitz condition is not satisfied. The solution to the initial value problem $$ \begin{cases} y ' = 100 y - 101 e^{-x} \\ y( 0 ) = 1 \end{cases} $$ is simply $y = e^{-x}$. If the initial value is changed to $y(0) = 1 + \epsilon$, the solution becomes $y = e^{-x} + \epsilon e^{100x}$, and unless $| \epsilon |$ is extremely small, the term $\epsilon e^{100x}$ quickly dominates and the error becomes enormous. The solution obtained from the unperturbed initial value is then impractical to use, and such a problem is said to be ill-conditioned. Conversely, if $\displaystyle \int_{x_{0}}^{x} {{ \partial f (t, Y(t) ) } \over {\partial y }} dt$ is bounded by a small positive number as $x$ increases, the problem is said to be well-conditioned.

The set of Lipschitz continuous functions on an interval $I$ is also denoted $C^{0,1} ( I )$.
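The blow-up in the ill-conditioned example above can be verified directly from the two exact solutions given in the text; the sketch below simply evaluates them and compares.

```python
import math

# Exact solutions of y' = 100 y - 101 e^{-x} (from the example above):
#   y(0) = 1       ->  y(x) = e^{-x}
#   y(0) = 1 + eps ->  y(x) = e^{-x} + eps * e^{100 x}
def y_exact(x):
    return math.exp(-x)

def y_perturbed(x, eps):
    return math.exp(-x) + eps * math.exp(100 * x)

# Even a perturbation of 1e-10 in the initial value overwhelms the
# true solution well before x = 0.5, since eps * e^{50} ~ 5e11.
eps = 1e-10
error = abs(y_perturbed(0.5, eps) - y_exact(0.5))  # = eps * e^{50}
```

Even though the perturbation is at the level of floating-point noise, the error at $x = 0.5$ is already on the order of $10^{11}$, which is what makes the problem ill-conditioned.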

See Also

Strong Lipschitz Condition $\implies$ Lipschitz Condition $\implies$ Local Lipschitz Condition