Subgradient Method
Definition
Let’s say the objective function $f : \mathbb{R}^{n} \to \mathbb{R}$ is a convex function, and denote a subgradient of $f$ at the point $\mathbf{x}^{(k)}$ by $\mathbf{g}^{(k)}$. The method of solving the optimization problem for $f$ by updating $\mathbf{x}^{(k)}$ as follows is called the subgradient method.
$$ \mathbf{x}^{(k+1)} = \mathbf{x}^{(k)} - \alpha \mathbf{g}^{(k)} $$
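As a minimal sketch of this update rule (the function name `subgradient_method`, the fixed step size `alpha`, and the subgradient oracle `subgrad` are assumptions made here for illustration, not part of the source):

```python
import numpy as np

def subgradient_method(subgrad, x0, alpha=0.01, num_iters=1000):
    """Sketch of the update x^(k+1) = x^(k) - alpha * g^(k).

    subgrad(x) is assumed to return some subgradient g of f at x.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        g = subgrad(x)      # g^(k): a subgradient of f at the current point
        x = x - alpha * g   # the update rule from the definition
    return x
```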
Description
It has the form of Gradient Descent with the gradient replaced by a subgradient.
$$ \text{gradient descent: } \mathbf{x}^{(k+1)} = \mathbf{x}^{(k)} - \alpha \nabla f(\mathbf{x}^{(k)}) $$
If $f$ is differentiable, Gradient Descent can be used, so the subgradient method is an optimization method chosen when the objective function is not differentiable. However, it has the disadvantage of a slow convergence rate.
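As a usage sketch with a nondifferentiable objective (this example, and the `subgradient_method` function it reuses from the sketch above, are assumptions for illustration): $f(\mathbf{x}) = \| \mathbf{x} \|_{1}$ is convex but not differentiable at points with zero coordinates, and the elementwise sign is a valid subgradient (taking $0$ at coordinates where $x_{i} = 0$).

```python
# Minimize f(x) = ||x||_1, whose minimizer is the origin.
# np.sign(x) is a valid subgradient of ||x||_1 at every point.
x_final = subgradient_method(subgrad=np.sign, x0=[3.0, -2.0],
                             alpha=0.01, num_iters=2000)
print(x_final)  # ends up within roughly `alpha` of the origin
```

With a fixed step size the iterates only reach a neighborhood of the minimizer whose size is proportional to the step size, which reflects the slow convergence mentioned above.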