Differentiation of the Absolute Value Function

Theorem

The derivative of the absolute value function is as follows.

$$ \frac{ d |x| } {d x} = \dfrac{1}{|x|}x = \begin{cases} 1 & x > 0 \\ -1 & x < 0 \end{cases}, \qquad x \neq 0 $$
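As a quick numerical check of this formula, the Python sketch below (the helper names `d_abs` and `central_diff` are my own, not part of the article) compares $\dfrac{x}{|x|}$ with a central-difference approximation of the derivative at a few points away from $x = 0$.

```python
def d_abs(x: float) -> float:
    """Derivative of |x| for x != 0, namely x / |x| (the sign of x)."""
    if x == 0:
        raise ValueError("|x| is not differentiable at x = 0")
    return x / abs(x)


def central_diff(f, x: float, h: float = 1e-6) -> float:
    """Central-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)


for x in (-3.0, -0.5, 0.25, 4.0):
    print(f"x = {x:5.2f}   formula = {d_abs(x):+.1f}   numeric ≈ {central_diff(abs, x):+.4f}")
```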

Explanation

In fact, the absolute value function is not differentiable on the entire set of real numbers because of the sharp corner at $x = 0$. However, once that single point is excluded from the domain, the function is differentiable on $\mathbb{R} \setminus \left\{ 0 \right\}$. In other words, unlike $f$, the function $g$ defined below has a derivative $g^{\prime}$.

$$ f (x) := |x|, \qquad x \in \mathbb{R} $$

$$ g (x) := |x|, \qquad x \in \mathbb{R} \setminus \left\{ 0 \right\} $$

In many cases $g^{\prime}$ can still be regarded as the derivative of $f$, in which case it is called the weak derivative of $f$. This is precisely why activation functions such as ReLU, used in deep learning, can be employed even though they are not differentiable at $x = 0$.
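For a concrete illustration, here is a minimal sketch that queries an automatic-differentiation engine for the gradient of $|x|$ and of ReLU at a few points. The choice of PyTorch is mine, not the article's; at the non-differentiable point $x = 0$ the framework simply returns a conventional value (for PyTorch this appears to be $0$), in the spirit of a weak derivative.

```python
import torch

# Gradient of |x|: the framework returns sign(x), assigning the
# conventional value 0 at the non-differentiable point x = 0.
x = torch.tensor([-2.0, 0.0, 3.0], requires_grad=True)
torch.abs(x).sum().backward()
print(x.grad)  # expected: tensor([-1., 0., 1.])

# Same idea for ReLU: a conventional value (0) is used at x = 0.
y = torch.tensor([-2.0, 0.0, 3.0], requires_grad=True)
torch.relu(y).sum().backward()
print(y.grad)  # expected: tensor([0., 0., 1.])
```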

Proof

Since $|x| = \sqrt{x^{2}}$, the chain rule gives, for $x \neq 0$,

$$ \begin{align*} \frac{ d |x| } {d x} &= \frac{d \sqrt{x^2} }{d x} \\ &= \frac{d \sqrt{x^2}}{d x^2} \frac{d x^2}{dx} \\ &= \frac{1}{2}\frac{1}{\sqrt{x^2}} \cdot 2x \\ &= \dfrac{1}{|x|}x \end{align*} $$
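The same chain-rule computation can be reproduced symbolically; below is a minimal sketch using SymPy (my choice of tool, assumed to be installed). The exact printed form of the result may vary between versions.

```python
import sympy as sp

x = sp.Symbol("x", real=True)

# Follow the proof: write |x| as sqrt(x^2) and differentiate via the chain rule.
derivative = sp.diff(sp.sqrt(x**2), x)

print(derivative)                # x/sqrt(x**2), i.e. x/|x| since x is real
print(sp.simplify(derivative))   # often rendered as sign(x)
```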