Ramp Function
Definition
The following function is called a ramp function.
$$ R(x) := \begin{cases} x & x \gt 0 \\ 0 & x \le 0 \end{cases} $$
Various Definitions
It can be defined in several ways as follows.
$$ \begin{align*} R(x) &:= \begin{cases} x & x \gt 0 \\ 0 & x \le 0 \end{cases} \\[1em] &= \max \left\{ 0, x \right\} \\[1em] &= x H(x) \\[1em] &= \dfrac{x + \left| x \right|}{2} \\[1em] &= H(x) \ast H(x) \\[1em] &= \int_{-\infty}^{x} H(\xi) d\xi \end{align*} $$
Here $H(x)$ denotes the unit step function, and $\ast$ denotes convolution.
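The closed-form definitions above can be checked numerically. The following sketch (plain Python, with `heaviside` defined here using the convention $H(0) = 0$ so that it matches $R(0) = 0$) verifies that the first four definitions agree pointwise:

```python
def heaviside(x):
    """Unit step function H(x); here H(0) = 0, matching R(0) = 0 above."""
    return 1.0 if x > 0 else 0.0

def ramp(x):
    """Ramp function: R(x) = x for x > 0, else 0."""
    return x if x > 0 else 0.0

# The alternative definitions agree at every sample point:
for x in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    assert ramp(x) == max(0.0, x)            # max{0, x}
    assert ramp(x) == x * heaviside(x)       # x H(x)
    assert ramp(x) == (x + abs(x)) / 2       # (x + |x|)/2
```

The convolution and integral definitions require integration and are omitted here, but $\int_{-\infty}^{x} H(\xi) d\xi$ clearly yields $x$ for $x > 0$ and $0$ otherwise.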
Explanation
Despite the similar spelling, this is not the lamp a genie lives in, but a ramp: the function is so named because its graph looks like an inclined ramp.
It is commonly used in electrical engineering and signal processing, where it appears as the ramp filter or ramp signal. In machine learning it is known as ReLU. This document discusses the ramp function itself; its significance as an activation function is covered in the ReLU document.
Properties
The derivative of the ramp function is the unit step function, and since the derivative of the unit step function is the Dirac delta function, the second derivative of the ramp function is the delta function. (Both statements hold in the sense of distributions, since $R$ is not differentiable at $x = 0$ and $H$ is discontinuous there.)
$$ R^{\prime}(x) = H(x) \quad \text{and} \quad R^{\prime \prime}(x) = H^{\prime}(x) = \delta (x) $$
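Away from the non-differentiable point $x = 0$, the identity $R^{\prime}(x) = H(x)$ can be confirmed with a central finite difference. A minimal sketch (sample points and step size chosen here for illustration):

```python
def ramp(x):
    """Ramp function: R(x) = x for x > 0, else 0."""
    return x if x > 0 else 0.0

def heaviside(x):
    """Unit step function with H(0) = 0."""
    return 1.0 if x > 0 else 0.0

h = 1e-6  # finite-difference step
for x in [-2.0, -0.5, 0.5, 2.0]:  # skip x = 0, where R'(x) does not exist classically
    numeric = (ramp(x + h) - ramp(x - h)) / (2 * h)
    assert abs(numeric - heaviside(x)) < 1e-6
```

The second identity $R^{\prime\prime}(x) = \delta(x)$ has no pointwise numerical analogue, since the delta function is a distribution rather than an ordinary function.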