
Show that (a form of the Rayleigh quotient):

$$ \min_{x : \left\| x \right\| = 1} {x}^{*} {A}^{*} A x = {\lambda}_{1}^{2} $$

Where $ {\lambda}_{1} $ is the smallest singular value of $ A \in \mathbb{C}^{n \times n}$.


My attempt:

Let us define $ B = {A}^{*} A $, and say that $ \left( x, {\lambda}_{i}^{2} \right) $ is an eigen-pair of $ B $, i.e., $ B x = {\lambda}_{i}^{2} x $, such that the objective of the above optimization problem can be expressed as

\begin{align} {x}^{*} \underbrace{B x}_{= {\lambda}_{i}^{2} x} = {\lambda}_{i}^{2} {x}^{*} x = {\lambda}_{i}^{2} \underbrace{{\left\| x \right\|}^{2}}_{= 1 \text{ by the constraint}} = {\lambda}_{i}^{2}. \end{align}

Also, $ B $ is Hermitian and positive semidefinite, so its eigenvalues are real and non-negative. Thus, the minimization problem boils down to a search over the eigenvalues for the minimum, $ \min_i \{ {\lambda}_{i}^{2} \} = {\lambda}_{1}^{2} $, the square of the smallest singular value of $ A $.
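As a numerical sanity check of the claim (a sketch using NumPy, not part of the proof), one can compare the smallest eigenvalue of $ B = {A}^{*} A $ with the smallest singular value of $ A $ and confirm that the corresponding eigenvector attains that value:

```python
# Sanity check: min_{||x||=1} x* A* A x should equal sigma_min(A)^2.
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

B = A.conj().T @ A                           # Hermitian, PSD
eigvals = np.linalg.eigvalsh(B)              # real eigenvalues, ascending
sigma_min = np.linalg.svd(A, compute_uv=False).min()

# Smallest eigenvalue of B equals the smallest singular value squared.
assert np.isclose(eigvals[0], sigma_min**2)

# The minimizer is the unit eigenvector for the smallest eigenvalue.
w, V = np.linalg.eigh(B)
x = V[:, 0]
assert np.isclose((x.conj() @ B @ x).real, sigma_min**2)
```

This only illustrates the identity for a random instance; the argument above is what proves it in general.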

Do you experts agree with this approach?


Can this be solved in some other ways, e.g., classical Lagrange multiplier based?

Thank you so much in advance.

  • Related to: https://math.stackexchange.com/questions/433169 – Royi Jun 02 '18 at 10:26
  • Pay attention that this is not a Convex Problem (though it has a unique minimizer). – Royi Jun 02 '18 at 10:32
  • This is a quadratic optimization problem. Right? This quadratic form is an ellipsoid, since $B \succeq 0$. So, geometrically it is convex!? or Am I confusing myself? – user550103 Jun 02 '18 at 14:02
  • The objective function is indeed convex. Yet the constraint is not (Non Linear equality constraint). – Royi Jun 02 '18 at 14:07
  • The constraint can be seen as an $\ell_2$-norm. Instead of equality, if we use an inequality, i.e., $\left\|\mathbf{x}\right\| \leq 1$, would the solution be quite different? – user550103 Jun 02 '18 at 14:26
  • Of course it will :-). For instance, take $ x = \boldsymbol{0} $. It obeys $ \left\| x \right\| \leq 1 $ and it certainly minimizes the objective function. – Royi Jun 02 '18 at 14:28
  • I assumed that $\mathbf{x} \neq \mathbf{0}$ (perhaps that assumption should be explicitly mentioned) :) – user550103 Jun 02 '18 at 14:31
  • Then again, the constraints are not convex and hence the problem is not convex. Anyhow, this problem is known to be non-convex, yet it still has a unique minimizer. – Royi Jun 02 '18 at 14:37
  • Thank you for the information. I really appreciate it. – user550103 Jun 02 '18 at 14:40
  • I can't see any mistakes in your argument. – Red shoes Jun 04 '18 at 03:56
  • @Royi is there a proof why this is not convex optimization? I know it is not but looking for hints why. What about the constraint makes it non-convex? – Mona Jalal Apr 07 '19 at 20:48
  • @MonaJalal Yes, the equality constraint on a norm makes it nonconvex; e.g., see https://math.stackexchange.com/questions/1301585/why-is-the-constraint-w-1-non-convex . – user550103 Apr 07 '19 at 20:58

2 Answers


I'll illustrate "Option I". Let $ x \in \mathbb{R}^n $ and let $A$ be an $n \times n$ real symmetric matrix. The problem is $$ \min_{x} x'Ax $$ s.t. $ \| x \|^2 = x'x = 1$. Constructing the Lagrange function, you have to minimize the following expression $$ \min_x \left( x'A x+\lambda (1 - x'x) \right). $$ Taking the derivative w.r.t. the vector $x$, you have $$ \frac{\partial }{\partial x}\left( x'A x+\lambda (1 - x'x) \right) = 2Ax - 2\lambda x = 0, $$ or $$ Ax = \lambda x. $$
Namely, the stationary points of Rayleigh's quotient are exactly the eigenvectors of $A$. Now, you have to choose among all the eigenvectors ($n$ at most). As such, note that if $x$ is a unit-norm eigenvector, then $Ax = \lambda x$, i.e., $x'Ax$ becomes $x'\lambda x = \lambda x'x = \lambda \| x\|^2 = \lambda$. Namely, you just have to choose the eigenvector that corresponds to the minimal eigenvalue, and then $ \min_{\|x\|=1} x'Ax = \lambda_{\min} $.
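A small numerical illustration of this argument (a sketch in NumPy, under the answer's assumption that $A$ is real symmetric): every unit eigenvector attains its eigenvalue as the objective value, and no unit vector goes below the smallest eigenvalue.

```python
# Each eigenvector of a symmetric A attains its eigenvalue as x'Ax,
# and random unit vectors never beat the smallest eigenvalue.
import numpy as np

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                        # real symmetric matrix

w, V = np.linalg.eigh(A)                 # eigenvalues in ascending order
for i in range(n):
    x = V[:, i]                          # unit-norm eigenvector
    assert np.isclose(x @ A @ x, w[i])   # objective equals eigenvalue

# Random unit vectors: the Rayleigh quotient stays >= lambda_min.
for _ in range(1000):
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    assert x @ A @ x >= w[0] - 1e-12
```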

  • Could you please show how you proceed to derive the dual from this primal? – Mona Jalal Apr 07 '19 at 21:05
  • Can you please state the question in more basic terms? (I'm not that familiar with optimization, hence I'm not sure what exactly you are asking.) – V. Vancak Apr 07 '19 at 21:23
  • Well, you already wrote the Lagrangian $\min_x \left( x'A x+\lambda (x'x - 1) \right)$ but stopped there. Given the primal $\min_x x^T B x$ s.t. $\|x\|=1$, how can I derive its dual optimization problem using the Lagrangian and prove strong duality? – Mona Jalal Apr 07 '19 at 21:26

I will sketch two approaches to the solution.

Option I

  1. Write the Lagrangian of the problem and show that the dual variable must be an eigenvalue of $ B $.
  2. Show that the eigenvector corresponding to the smallest eigenvalue achieves the minimum value, which solves the problem.

Option II

Define $ B = {A}^{T} A $, which is PSD and hence has an eigendecomposition $ B = {V}^{T} D V $, where $ V $ is a unitary matrix and $ D $ is a diagonal matrix with non-negative entries.

So the problem:

$$\begin{align*} \min_{x} \quad & {x}^{T} B x \\ \text{subject to} \quad & \left\| x \right\| = 1 \end{align*}$$

Becomes:

$$\begin{align*} \min_{x} \quad & {y}^{T} D y \\ \text{subject to} \quad & \left\| y \right\| = 1 \end{align*}$$

Where $ y = V x $.
Since $ V $ is unitary, it preserves the $ {L}_{2} $ norm, so the constraint $ \left\| y \right\| = 1 $ is equivalent to $ \left\| x \right\| = 1 $.

Now, if you think about it, choosing $ x $ amounts to selecting and weighting the columns of $ {V}^{T} $ through $ y = V x $.
It is only logical that the minimizer puts all of its weight on the entry matching the smallest diagonal value of $ D $, which is exactly the eigenvector paired with the smallest eigenvalue.
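Option II can be sketched numerically as well (a NumPy illustration under this answer's convention $ B = V^T D V $): the change of variables $ y = Vx $ preserves both the objective and the norm, and putting all weight on the smallest entry of $ D $ recovers the minimizing eigenvector.

```python
# Option II numerically: x'Bx = y'Dy with y = Vx, and y = e_1
# (all weight on the smallest entry of D) gives the minimizer.
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
B = A.T @ A                              # PSD by construction

w, Q = np.linalg.eigh(B)                 # B = Q diag(w) Q', w ascending
V = Q.T                                  # so B = V' diag(w) V as in the text
D = np.diag(w)

x = rng.standard_normal(n)
x /= np.linalg.norm(x)
y = V @ x
assert np.isclose(x @ B @ x, y @ D @ y)      # objective is preserved
assert np.isclose(np.linalg.norm(y), 1.0)    # unitary V preserves the norm

# y = e_1 corresponds to x = V' e_1, the smallest-eigenvalue eigenvector.
x_min = V.T[:, 0]
assert np.isclose(x_min @ B @ x_min, w[0])
```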
