Subgradient Method
Definition
Let's say the objective function $f : \mathbb{R}^{n} \to \mathbb{R}$ is a convex function. Let's denote a subgradient of $f$ at the point $x^{(k)}$ as $g^{(k)}$. The method of updating $x$ in the following way to solve the optimization problem $\min\limits_{x} f(x)$ is called the subgradient method.

$$ x^{(k+1)} = x^{(k)} - \alpha_{k} g^{(k)} $$

Here $\alpha_{k} > 0$ is the step size.
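For instance, in the one-dimensional case $f(x) = \left| x \right|$ (a standard illustrative example, not part of the definition above), the subdifferential is

$$ \partial f(x) = \begin{cases} \{ \operatorname{sign}(x) \} & \text{if } x \neq 0 \\ [-1, 1] & \text{if } x = 0 \end{cases} $$

so $g^{(k)} = \operatorname{sign}\left( x^{(k)} \right)$ is a valid choice of subgradient at every point, including the kink at $0$ where the gradient does not exist.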
Description
It has the same form as Gradient Descent, with the gradient $\nabla f \left( x^{(k)} \right)$ replaced by a subgradient $g^{(k)}$.
If $f$ is differentiable, one can simply use Gradient Descent, so the subgradient method is chosen when the objective function is not differentiable. However, it has the disadvantage of a slow convergence rate. A concrete sketch of the method follows below.
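As a minimal sketch of the update above (the names `subgradient_method` and `subgrad`, and the diminishing step-size rule $\alpha_{k} = 1 / (k+1)$, are illustrative choices, not fixed by the definition), the following Python code minimizes $f(x) = \lVert x \rVert_{1}$, which is convex but not differentiable at the origin:

```python
import numpy as np

def subgradient_method(f, subgrad, x0, step=lambda k: 1.0 / (k + 1), iters=1000):
    """Minimize a convex function f with the subgradient method."""
    x = np.asarray(x0, dtype=float)
    x_best, f_best = x.copy(), f(x)
    for k in range(iters):
        g = subgrad(x)            # any subgradient g^(k) of f at x^(k)
        x = x - step(k) * g       # x^(k+1) = x^(k) - alpha_k * g^(k)
        fx = f(x)
        if fx < f_best:           # keep the best iterate seen so far
            x_best, f_best = x.copy(), fx
    return x_best, f_best

# Example: f(x) = ||x||_1. np.sign(x) is a valid subgradient
# (at x_i = 0 any value in [-1, 1] would do; sign gives 0 there).
f = lambda x: np.abs(x).sum()
subgrad = np.sign

x_star, f_star = subgradient_method(f, subgrad, x0=[3.0, -2.0])
print(x_star, f_star)  # approaches the minimizer x = 0
```

Because a subgradient step does not necessarily decrease $f$ at every iteration, the sketch keeps track of the best iterate found so far, which is a common convention for this method.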