I am trying to solve the following constrained optimization problem:
$$ \begin{array}{rl} \underset{V \in \Bbb R^{D \times d}}{\operatorname{minimize}} & \operatorname{trace} \left( W V^\top \right) \\ \text{such that} & \frac12 x^\top VFV^\top x = c, \end{array}$$
where $W \in \Bbb R^{D \times d}$ (with $D > d$) is known, $c$ is a known constant, $x$ is a known $D \times 1$ vector, and $F$ is a known $d \times d$ matrix.
The Lagrangian is given by: $$ L = \operatorname{trace}(WV^\top) + \frac{\lambda}{2} (x^\top VFV^\top x - c). $$ Setting the gradient of the Lagrangian with respect to $V$ to zero (assuming $F$ is symmetric and invertible), we have: $$ \nabla_V L = W + \lambda x x^\top VF = 0\\ \Rightarrow xx^\top V = -\frac{1}{\lambda}WF^{-1}. $$ The problem is that the $D \times D$ matrix $xx^\top$ has rank one and is therefore not invertible, so this equation cannot be solved for $V$ directly. Can anyone tell me:
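To make sure the stationarity condition itself is not the issue, I verified the closed-form gradient $\nabla_V L = W + \lambda\, xx^\top V F$ numerically against finite differences, and also confirmed that $xx^\top$ is rank-one. A minimal sketch in NumPy (all sizes and matrices below are made-up test data, not from the actual problem; $F$ is symmetrized because the derivation assumes it):

```python
import numpy as np

rng = np.random.default_rng(0)
D, d, lam, c = 5, 3, 0.7, 1.0

W = rng.standard_normal((D, d))
x = rng.standard_normal((D, 1))
A = rng.standard_normal((d, d))
F = A + A.T                       # the derivation assumes F is symmetric
V = rng.standard_normal((D, d))   # arbitrary evaluation point

def L(V):
    """Lagrangian: trace(W V^T) + (lam/2) (x^T V F V^T x - c)."""
    return np.trace(W @ V.T) + 0.5 * lam * (x.T @ V @ F @ V.T @ x - c).item()

# Closed-form gradient from the derivation: W + lam * x x^T V F
grad = W + lam * (x @ x.T) @ V @ F

# Central finite-difference gradient, entry by entry
eps = 1e-6
num = np.zeros_like(V)
for i in range(D):
    for j in range(d):
        E = np.zeros_like(V)
        E[i, j] = eps
        num[i, j] = (L(V + E) - L(V - E)) / (2 * eps)

print(np.allclose(grad, num, atol=1e-5))  # True: the gradient formula checks out
print(np.linalg.matrix_rank(x @ x.T))     # 1: xx^T is rank-one, hence singular
```

So the gradient formula is correct; the obstruction really is the singularity of $xx^\top$.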
- Why does the method of Lagrange multipliers fail in this case?
- Is using the pseudoinverse of $xx^\top$ to solve for $V$ a reasonable alternative in this case?
Thanks a lot!