
Lipschitz Condition

Definition

We can find the Lipschitz condition in the statement of the Existence-Uniqueness Theorem for First-Order Differential Equations.

For a continuous function $f$ defined on $D \subset \mathbb{R}^{2}$ with an initial value problem given by
$$\begin{cases} y ' = f(x,y) \\ y( x_{0} ) = Y_{0} \end{cases}$$
if $f$ satisfies the Lipschitz condition
$$|f(x,y_{1} ) - f(x,y_{2}) | \le K | y_{1} - y_{2} |$$
for all $(x,y_{1}), (x , y_{2}) \in D$ and some $K > 0$, then for $(x_{0} , Y_{0}) \in D^{\circ}$ there exists a unique solution $Y(x)$ on an appropriate interval $I := [ x_{0} - \alpha , x_{0} + \alpha ]$.

Explanation

If we translate the Lipschitz condition into a more familiar expression, we can write it as
$$\left| {{ f(x,y_{1} ) - f(x,y_{2}) } \over { y_{1} - y_{2} }} \right| \le K.$$
Even in the worst case we may take
$$K = \max_{(x,y) \in D} \left| {{ \partial f(x,y) } \over { \partial y }} \right|,$$
so boundedness of the partial derivative of $f$ with respect to $y$ amounts to a similar condition. This means the function values cannot change abruptly, at least near the initial value $( x_{0} , Y_{0} )$, and problems of this kind are among the easier ones to solve.
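As a sanity check, here is a minimal sketch in Python (the rectangle $D = [0,1] \times [-2,2]$ and the sampling grid are illustrative choices, not from the text) that estimates a Lipschitz constant by sampling the difference quotient of the function $f(x,y) = 100y - 101e^{-x}$ from the example later in this article:

```python
import numpy as np

# f(x, y) = 100*y - 101*exp(-x): the right-hand side from the
# ill-conditioned example later in this article.
def f(x, y):
    return 100 * y - 101 * np.exp(-x)

# Sample the difference quotient |f(x,y1) - f(x,y2)| / |y1 - y2|
# over an (illustrative) rectangle D = [0, 1] x [-2, 2].
xs = np.linspace(0.0, 1.0, 20)
ys = np.linspace(-2.0, 2.0, 20)

K_est = 0.0
for x in xs:
    for y1 in ys:
        for y2 in ys:
            if y1 != y2:
                K_est = max(K_est, abs(f(x, y1) - f(x, y2)) / abs(y1 - y2))

print(K_est)  # 100.0, matching max |∂f/∂y| = 100 on D
```

Because this $f$ is linear in $y$, every sampled quotient equals $100$ exactly, which agrees with the worst-case formula $K = \max |\partial f / \partial y|$ above.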

This condition is needed to explain the concept of stability. Consider perturbing the initial value problem above by a small $\delta (x)$ and $\epsilon$:
$$\begin{cases} Y ' (x ; \epsilon) = f(x, Y(x ;\epsilon ) ) + \delta (x) \\ Y( x_{0} ; \epsilon ) = Y_{0} + \epsilon \end{cases}$$
Although the two problems are mathematically different, if $| \delta |$ and $| \epsilon |$ are sufficiently small and the Lipschitz condition is satisfied, then the following holds.

For a continuous function $f$ defined on $D \subset \mathbb{R}^{2}$ with the perturbed initial value problem
$$\begin{cases} Y ' (x ; \epsilon) = f(x, Y(x ;\epsilon ) ) + \delta (x) \\ Y( x_{0} ; \epsilon ) = Y_{0} + \epsilon \end{cases}$$
if $f$ satisfies the Lipschitz condition, then for $(x_{0} , Y_{0}) \in D^{\circ}$ and sufficiently small $\epsilon_{0} > 0$ there exists a unique solution $Y(x ; \delta, \epsilon )$ on an appropriate interval $I := [ x_{0} - \alpha , x_{0} + \alpha ]$, uniformly for all perturbations satisfying $| \epsilon | \le \epsilon_{0}$ and $\| \delta \|_{\infty} \le \epsilon_{0}$.
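To see what this promises when the Lipschitz constant is small, here is a minimal sketch (assuming the illustrative problem $y' = -y$, $y(0) = 1$, which satisfies the Lipschitz condition with $K = 1$; neither this problem nor the forward Euler scheme comes from the text) comparing a solution against one started from a perturbed initial value:

```python
import numpy as np

def f(x, y):
    return -y  # Lipschitz in y with K = 1

def euler(f, x0, y0, h, n):
    """Forward Euler, the simplest scheme; enough for a demonstration."""
    xs, ys = [x0], [y0]
    for _ in range(n):
        ys.append(ys[-1] + h * f(xs[-1], ys[-1]))
        xs.append(xs[-1] + h)
    return np.array(xs), np.array(ys)

eps = 1e-3
_, y_orig = euler(f, 0.0, 1.0, 0.01, 500)        # y(0) = 1
_, y_pert = euler(f, 0.0, 1.0 + eps, 0.01, 500)  # y(0) = 1 + eps

# The gap between the two solutions never exceeds the initial
# perturbation; for this problem it even decays like eps * e^{-x}.
print(np.max(np.abs(y_pert - y_orig)))  # 1e-3, attained at x = 0
```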

The importance of stability in solving differential equations becomes clear when we consider numerical approximations. If the whole model had to be overhauled every time a small numerical change occurred as new data arrived, it would be useless in practice. Let's take an example to see what goes wrong when the Lipschitz constant is large. The solution to the initial value problem
$$\begin{cases} y ' = 100 y - 101 e^{-x} \\ y( 0 ) = 1 \end{cases}$$
is simply $y = e^{-x}$. Here $f(x,y) = 100y - 101 e^{-x}$ satisfies the Lipschitz condition only with $K = 100$. If we change the initial value to $y(0) = 1 + \epsilon$, the solution becomes $y = e^{-x} + \epsilon e^{100x}$, and unless $| \epsilon |$ is extremely small the error becomes far too large. The solution obtained from the unperturbed initial value is then impractical to use, and such a problem is said to be ill-conditioned. Conversely, if for increasing $x$ the integral
$$\int_{x_{0}}^{x} {{ \partial f (t, Y(t) ) } \over {\partial y }} dt$$
is bounded above by a small positive number, the problem is said to be well-conditioned; in the example above $\partial f / \partial y = 100$, so this integral equals $100 (x - x_{0})$ and grows without bound. The set of Lipschitz continuous functions on an interval $I$ is also denoted $C^{0,1} ( I )$.
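The blow-up in this example is easy to check by evaluating the two closed-form solutions directly; a minimal sketch (the size of $\epsilon$ and the grid of $x$ values are illustrative choices):

```python
import numpy as np

eps = 1e-6  # a tiny perturbation of the initial value: y(0) = 1 + eps

xs = np.linspace(0.0, 0.5, 6)
y_exact = np.exp(-xs)                          # solution for y(0) = 1
y_pert = np.exp(-xs) + eps * np.exp(100 * xs)  # solution for y(0) = 1 + eps

for x, err in zip(xs, np.abs(y_pert - y_exact)):
    print(f"x = {x:.1f}   |error| = {err:.3e}")

# Even with eps = 1e-6, the error term eps * e^{100x} grows from
# 1e-6 at x = 0 to roughly 5.2e+15 at x = 0.5: ill-conditioned.
```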

See Also

Strong Lipschitz Condition $\implies$ Lipschitz Condition $\implies$ Local Lipschitz Condition