Given the Scalar Huber Loss Function:

$$ {L}_{\delta} \left( x \right) = \begin{cases} \frac{1}{2} {x}^{2} & \text{for} \; \left| x \right| \leq \delta \\ \delta (\left| x \right| - \frac{1}{2} \delta) & \text{for} \; \left| x \right| > \delta \end{cases} $$

For the vector case, one applies the scalar function component-wise and then sums over all components:

$$ {H}_{\delta} \left( x \right) = \sum_{i} {L}_{\delta} \left( {x}_{i} \right) $$

What is the Proximal Operator for the vector function?
Namely what's $ \operatorname{prox}_{\lambda {H}_{\delta} \left( \cdot \right)} \left( y \right) = \arg \min_{x} \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} + \lambda {H}_{\delta} \left( x \right) $?

Could anyone implement it in MATLAB?

Royi

1 Answer

From @dohmatob's answer to Proximal Operator of the Huber Loss Function we know the solution for the case $ \delta = 1 $:

$$ {\left( \operatorname{prox}_{\lambda {H}_{1} \left( \cdot \right)} \left( y \right) \right)}_{i} = {y}_{i} - \frac{\lambda {y}_{i}}{\max \left( \left| {y}_{i} \right|, \lambda + 1 \right)} $$

Since $ {H}_{\delta} \left( x \right) = {\delta}^{2} {H}_{1} \left( \frac{x}{\delta} \right) $ one could use the Scaling Property of the Proximal Operator:

$$\begin{aligned} \operatorname{prox}_{\lambda {H}_{\delta} \left( \cdot \right)} \left( y \right) & = \operatorname{prox}_{ {\delta}^{2} \lambda {H}_{1} \left( \frac{\cdot}{\delta} \right)} \left( y \right) \\ & = \delta \operatorname{prox}_{ \frac{{\delta}^{2} \lambda}{ {\delta}^{2} } {H}_{1} \left( \cdot \right)} \left( \frac{y}{\delta} \right) \\ & = \delta \operatorname{prox}_{ \lambda {H}_{1} \left( \cdot \right)} \left( \frac{y}{\delta} \right) \end{aligned}$$

Hence it is given by:

$$ {\left( \operatorname{prox}_{\lambda {H}_{\delta} \left( \cdot \right)} \left( y \right) \right)}_{i} = {y}_{i} - \frac{\lambda {y}_{i}}{\max \left( \left| \frac{{y}_{i}}{\delta} \right|, \lambda + 1 \right)} $$

A MATLAB implementation is given in my answer to Proximal Operator of Huber Loss Function (For $ {L}_{1} $ Regularized Huber Loss of a Regression Function).
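Since I can't verify MATLAB code here, the closed form above can be sanity-checked with a NumPy sketch instead (the translation to MATLAB is essentially line-for-line; the names `prox_huber` and `huber_elem` are my own, not from any library). It compares the closed-form expression against a dense grid search on a scalar instance of the minimization:

```python
import numpy as np

def prox_huber(y, lam, delta):
    """Closed-form prox of lam * H_delta, applied elementwise at y."""
    y = np.asarray(y, dtype=float)
    return y - lam * y / np.maximum(np.abs(y / delta), lam + 1.0)

def huber_elem(x, delta):
    """Scalar Huber loss L_delta applied elementwise (no summation)."""
    ax = np.abs(x)
    return np.where(ax <= delta, 0.5 * x ** 2, delta * (ax - 0.5 * delta))

# Sanity check: minimize 0.5*(x - y)^2 + lam * L_delta(x) by brute force
lam, delta, y = 0.7, 0.5, 1.3
grid = np.linspace(-3.0, 3.0, 600_001)  # step size 1e-5
obj = 0.5 * (grid - y) ** 2 + lam * huber_elem(grid, delta)
x_star = grid[np.argmin(obj)]

# Here |y/delta| = 2.6 > lam + 1, so the closed form gives y - lam*delta
print(float(prox_huber(y, lam, delta)))  # ≈ 0.95
print(x_star)                            # ≈ 0.95
```

In the quadratic region ($ \left| {y}_{i} \right| \leq \left( \lambda + 1 \right) \delta $) the formula reduces to $ \frac{{y}_{i}}{\lambda + 1} $, and in the linear region to $ {y}_{i} - \lambda \delta \operatorname{sign} \left( {y}_{i} \right) $, matching the two branches of the scalar loss.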

Royi
  • Shouldn't it be that the expression for $prox_{\lambda H_\delta} (y)$ has $\delta^2$ instead of $\delta$? – shani Jan 17 '22 at 07:51
  • @shani, Could you be more specific where? – Royi Jan 17 '22 at 14:22
  • I believe this post is linked with the question here https://math.stackexchange.com/questions/1650411/proximal-operator-of-the-huber-loss-function?rq=1. The answer given for the minimization problem for z should have $\mu^2 \text{prox}_{h\sigma}(y/\mu) $. Isn't it? – shani Jan 17 '22 at 14:25
  • $z = \arg\min_{z}\frac{1}{2}|\mu z - y|^2 + \mu^2\sigma h(z) = \mu^2 [\arg\min_{z}\frac{1}{2}|z - y/\mu|^2 + \sigma h(z) ] = \mu^2 \text{prox}_{\sigma h}(y/\mu). $ Isn't it? – shani Jan 17 '22 at 14:27
  • I couldn't find an error in the derivation above. You may look at the scaling property of the proximal operator. See https://i.imgur.com/rxHtGVc.png. – Royi Jan 18 '22 at 11:49
  • Sorry, I think I have made a mistake in my derivation. I checked it numerically as well. Your answer is correct. – shani Jan 18 '22 at 22:56