For an unconstrained convex minimization problem $$\begin{align} \text{minimize} \quad f(x) \end{align}$$ it is well known that for a sufficiently small fixed step size $\alpha$ (e.g. $\alpha \leq 1/L$ when $\nabla f$ is $L$-Lipschitz), the gradient descent procedure defined by $$\hat x \equiv x - \alpha \nabla f(x)$$ converges to a minimizer.
Now consider a convex minimization problem with a simple sign constraint: $$\begin{align} \text{minimize} \quad &f(x) \\ \text{subject to} \quad &x \geq 0 \end{align}$$ If we apply gradient descent with a fixed step size, some coordinates $x_i$ may turn negative. A common way to prevent this is clamping, i.e. taking $$\hat x_i \equiv \max\{0,\ x_i - \alpha \nabla_{x_i} f(x)\}.$$ (For this particular constraint set, clamping agrees with projected gradient descent: the Euclidean projection of $x - \alpha \nabla f(x)$ onto the nonnegative orthant is exactly this componentwise clamp.)
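For concreteness, here is a minimal sketch of the update I mean (the quadratic objective, step size, and iteration count are placeholders chosen only for illustration):

```python
import numpy as np

# Toy convex objective: f(x) = 0.5 * ||A x - b||^2, whose unconstrained
# minimizer has a negative coordinate, so the constraint x >= 0 is active.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -2.0])

def grad_f(x):
    return A.T @ (A @ x - b)

alpha = 0.05          # fixed step size
x = np.ones(2)        # feasible starting point

for _ in range(1000):
    # plain gradient step, then clamp each coordinate at zero
    x = np.maximum(0.0, x - alpha * grad_f(x))

print(x)  # iterates stay in the nonnegative orthant throughout
```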
- Is there a canonical name for this gradient descent with fixed step size and clamping?
- What are the conditions for its convergence?
I took the name "clamping" from this SE question; it doesn't seem to be a widely used term.
In computational economics, there is an analogous gradient-like process called tâtonnement, in which prices adjust in the direction of excess demand and are clamped at zero whenever the update would drive them negative: $p_i \mapsto \max\{0,\ p_i + \alpha z_i(p)\}$, where $z$ is the excess-demand function (the analogue of $-\nabla f$). This is known to converge if $z$ satisfies a property called gross substitutability and the step size is sufficiently small. A canonical reference is H. Uzawa (1960), “Walras’ tâtonnement in the theory of exchange,” The Review of Economic Studies 27, no. 3: 182–94.
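For illustration, here is a rough sketch of that tâtonnement rule with a made-up linear excess-demand function (not taken from Uzawa's paper); the off-diagonal signs are chosen so that $\partial z_i / \partial p_j \geq 0$ for $i \neq j$, a linear stand-in for gross substitutability:

```python
import numpy as np

# Hypothetical linear excess-demand function z(p) = d - B p, chosen only for
# illustration; the nonpositive off-diagonal entries of B make dz_i/dp_j >= 0
# for i != j, a linear stand-in for gross substitutability.
B = np.array([[2.0, -0.5], [-0.5, 1.5]])
d = np.array([1.0, -1.0])

def excess_demand(p):
    return d - B @ p

alpha = 0.1           # small fixed adjustment speed
p = np.ones(2)        # initial prices

for _ in range(1000):
    # prices move in the direction of excess demand, clamped at zero
    p = np.maximum(0.0, p + alpha * excess_demand(p))

print(p, excess_demand(p))  # the second price clamps at zero with negative excess demand
```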