This tag is for questions relating to subgradients and the subgradient method, an iterative method for solving convex minimization problems, used predominantly in nondifferentiable optimization for functions that are convex but not differentiable. The subgradient method is a very simple algorithm for minimizing convex nondifferentiable functions in settings where Newton's method and simple linear programming will not work.
The subgradient (related to the subderivative and the subdifferential) of a function is a way of generalizing the derivative of a convex function to points where it is not differentiable.
Definition: A vector $g \in \mathbb{R}^n$ is a subgradient of $f : \mathbb{R}^n \to \mathbb{R}$ at $x \in \operatorname{dom} f$ if for all $z \in \operatorname{dom} f$, $$f(z) \ge f(x) + g^T(z - x).$$
Note: If $f$ is convex and differentiable, then its gradient at $x$ is a subgradient. But a subgradient can exist even when $f$ is not differentiable at $x$.
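As a small worked example (added here for illustration; it is the standard textbook example and not taken from the references below), let $f(x) = |x|$ on $\mathbb{R}$. At $x = 0$ the subgradient inequality becomes $$|z| \ge |0| + g(z - 0) = gz \quad \text{for all } z \in \mathbb{R},$$ which holds exactly when $-1 \le g \le 1$. So every $g \in [-1, 1]$ is a subgradient of $f$ at $0$, even though $f$ is not differentiable there; at any $x \ne 0$ the only subgradient is $g = \operatorname{sign}(x)$, the ordinary derivative.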
Subgradient methods converge even when applied to a nondifferentiable objective function. When the objective function is differentiable, subgradient methods for unconstrained problems use the same search direction as the method of steepest descent. Subgradient methods are slower than Newton's method when applied to minimize twice continuously differentiable convex functions; however, Newton's method fails to converge on problems that have nondifferentiable kinks.
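Below is a minimal sketch of the basic subgradient iteration $x^{(k+1)} = x^{(k)} - \alpha_k g^{(k)}$, where $g^{(k)}$ is any subgradient of $f$ at $x^{(k)}$ and $\alpha_k$ is a diminishing step size. The problem instance ($f(x) = \|Ax - b\|_1$ with random data) and the step-size rule $\alpha_k = 1/k$ are illustrative assumptions, not prescriptions from the references below.

```python
import numpy as np

# Illustrative problem (assumed here): minimize f(x) = ||A x - b||_1,
# a convex but nondifferentiable function of x.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def f(x):
    return np.abs(A @ x - b).sum()

def subgradient(x):
    # A subgradient of ||A x - b||_1 is A^T sign(A x - b);
    # np.sign returns 0 at the kinks, which is still a valid choice.
    return A.T @ np.sign(A @ x - b)

x = np.zeros(A.shape[1])
x_best, f_best = x.copy(), f(x)

for k in range(1, 5001):
    g = subgradient(x)
    alpha = 1.0 / k                 # diminishing, nonsummable step size
    x = x - alpha * g               # basic subgradient step
    # The subgradient method is not a descent method,
    # so keep the best iterate seen so far.
    if f(x) < f_best:
        x_best, f_best = x.copy(), f(x)

print("best objective value found:", f_best)
```

Because the method is not a descent method, the sketch tracks the best iterate found so far, which is the quantity the usual convergence statements refer to.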
For more details, see the following references:
https://see.stanford.edu/materials/lsocoee364b/01-subgradients_notes.pdf
https://people.csail.mit.edu/dsontag/courses/ml16/slides/notes_convexity16.pdf
https://optimization.mccormick.northwestern.edu/index.php/Subgradient_optimization