
Assume that we have this quadratic function:

$$ J = \frac {1}{2}x^TQx + c^Tx $$

And our gradient is the derivative:

$$ J_g = x^TQ + c^T $$

We want to minimize $J$, and we can do that by setting $J_g = 0$ and solving for $x$, which gives $x = -Q^{-1}c$ when $Q$ is symmetric and invertible.

Or we can use gradient descent.

$$x^T_{k+1} = x^T_{k} - \alpha J_g(x_{k})$$

Where $\alpha > 0$ is a small positive number (the step size).
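For reference, the unconstrained update can be sketched in NumPy like this (using a minus sign, since we are descending; `Q`, `c`, `alpha`, and the iteration count are placeholders chosen for illustration):

```python
import numpy as np

def gradient_descent(Q, c, x0, alpha=0.1, iters=200):
    """Minimize J = 0.5 x^T Q x + c^T x by stepping against
    the gradient J_g = Q x + c (column form, Q symmetric)."""
    x = x0.copy()
    for _ in range(iters):
        x = x - alpha * (Q @ x + c)
    return x
```

For example, with $Q = 2I$ and $c = (-2, -2)^T$ this converges to the unconstrained minimizer $x = (1, 1)^T$.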

That sounds easy. But what would I do if I want to minimize $J$ with constraints:

$$Ax \leq b $$ $$x \geq 0$$

What would I do then? What method should I use? Can I use if-statements to check when $x$ is outside of the constraints?

euraad

1 Answer


We can reformulate the constraints as \begin{equation} \begin{pmatrix} A \\ -I \end{pmatrix} x \leq \begin{pmatrix} b \\ 0 \end{pmatrix} \end{equation} and denote $\begin{pmatrix} A \\ -I \end{pmatrix}$ by $\bar{A}$ and $\begin{pmatrix} b \\ 0 \end{pmatrix}$ by $\bar{b}$. Thus the original problem is equivalent to \begin{equation} \begin{array}{cl} {\min} & {\frac{1}{2} x^TQx + c^T x} \\ {\text{s.t.}} & {\bar{A}x \leq \bar{b}.} \end{array} \end{equation}

To apply gradient descent to this problem, we need one more step per iteration: \begin{equation} \begin{aligned} & \bar{x} = x_k - \alpha(Qx_k+c), \\ & x_{k+1} = \operatorname{proj}_{\bar{A}x \leq \bar{b}}(\bar{x}), \end{aligned} \end{equation} where $\alpha > 0$ is the step size and $\operatorname{proj}_{\bar{A}x \leq \bar{b}}(\cdot)$ is the projection operator: \begin{equation} \operatorname{proj}_{\bar{A}x \leq \bar{b}}(\bar{x}) = \arg\min_{\bar{A}x \leq \bar{b}} \|x - \bar{x}\|^2. \end{equation} This projection is itself a small quadratic program and can be solved by, e.g., the proximal gradient method.
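As a rough illustration of this scheme (not the proximal gradient method itself), here is a NumPy sketch in which the projection onto $\bar{A}x \leq \bar{b}$ is approximated with Dykstra's alternating-projection algorithm, which cycles through the halfspaces; the function names and problem data are made up for the example:

```python
import numpy as np

def project_halfspace(x, a, b):
    """Closed-form projection of x onto the halfspace {y : a @ y <= b}."""
    viol = a @ x - b
    if viol <= 0:
        return x
    return x - (viol / (a @ a)) * a

def project_polyhedron(x0, A_bar, b_bar, sweeps=100):
    """Approximate projection onto {x : A_bar x <= b_bar} via
    Dykstra's algorithm (repeated sweeps over the halfspaces)."""
    m = A_bar.shape[0]
    x = x0.copy()
    p = np.zeros((m, x0.size))  # Dykstra correction terms, one per halfspace
    for _ in range(sweeps):
        for i in range(m):
            y = x + p[i]
            x_new = project_halfspace(y, A_bar[i], b_bar[i])
            p[i] = y - x_new
            x = x_new
    return x

def projected_gradient(Q, c, A_bar, b_bar, x0, alpha=0.1, iters=300):
    """Minimize 0.5 x^T Q x + c^T x subject to A_bar x <= b_bar."""
    x = x0.copy()
    for _ in range(iters):
        x_bar = x - alpha * (Q @ x + c)              # gradient step
        x = project_polyhedron(x_bar, A_bar, b_bar)  # projection step
    return x
```

For example, with $Q = 2I$, $c = (-2, -2)^T$ and the constraints $x_1 + x_2 \leq 1$, $x \geq 0$ stacked into $\bar{A}$ and $\bar{b}$ as above, this converges to the constrained minimizer $(0.5, 0.5)^T$ rather than the unconstrained one at $(1, 1)^T$.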

One can also refer to Quadratic Programming for other methods (e.g. SQP).

Zenan Li
  • Is SQP better to use? – euraad Mar 27 '20 at 12:05
  • @DanielMårtensson, in my humble opinion, if the scale of your problem is not very large, the interior point method and SQP are recommended. However, if the scale is very large, maybe you can use projected gradient descent or a semidefinite relaxation method. – Zenan Li Mar 27 '20 at 15:08
  • I think I will use the simplest method first, your suggestion. So all I need to do is to first do gradient descent and then corrects it with projection, which is minimize on specific constraints? How should I minimize with constraints? Using simplex method? – euraad Mar 27 '20 at 15:40
  • I could not understand your second question exactly. But what you need to do is take a gradient descent step and then correct it with the projection. The simplex method is not suitable for this quadratic program. – Zenan Li Mar 27 '20 at 15:51
  • But how should I minimize the $||x - \bar x||^2$? I can see that this is a typical least squares problem, if I exclude the constraints. – euraad Mar 27 '20 at 16:03
  • Can you show an example ? – euraad Mar 27 '20 at 20:34
  • https://math.stackexchange.com/questions/2210394/projection-of-a-point-onto-a-convex-polyhedra – Zenan Li Mar 28 '20 at 01:05
  • Same answer. Did not help me. – euraad Mar 28 '20 at 20:44