
Given rank-$1$ square matrices $A_1, A_2, \dots, A_n$, determine whether there exists $x \in \mathbb R_{>0}^n$ such that

$$ \sum_{i=1}^n x_i A_i = x_1 A_1 + \cdots + x_n A_n $$

is singular, or certify that no such $x$ exists (i.e., that the positive linear combination is nonsingular for all positive $x$). Is there a well-known method or algorithm to do this?

M.A
  • Is the size of the $A_k$ somehow constrained by $n$? – Calle Apr 21 '15 at 19:32
  • No, but to have a meaningful situation we need to have $n \ge m$ (where the matrices are of size $m\times m$) – M.A Apr 21 '15 at 22:00
  • This is interesting (and I tried, but failed, to offer an answer) but to the best of my knowledge it is not convex optimization. – Michael Grant Apr 22 '15 at 12:48

2 Answers


Here is a partial answer, and perhaps someone will be able to complete it.

Suppose that each $A_i\in\mathbb{R}^{m\times m}$. Because each $A_i$ is rank 1, it can be written as a dyad $A_i=f_i g_i^T$, where $f_i$ and $g_i$ are vectors. Then $$\sum_i x_i A_i = \sum_i x_i f_i g_i^T = F X G^T$$ where $X\triangleq\mathop{\textrm{diag}}(x)\in\mathbb{R}^{n\times n}$, and $F,G\in\mathbb{R}^{m\times n}$ collect the vectors $f_i$, $g_i$, respectively, as columns. Let $$r_F\triangleq\mathop{\textrm{rank}}(F)\leq\min\{m,n\}, \quad r_G\triangleq\mathop{\textrm{rank}}(G)\leq\min\{m,n\}.$$ Since $X$ is full rank, we have $$\mathop{\textrm{rank}}(FXG^T)\leq\min\{r_F,r_G\}\leq m.$$ So if either $F$ or $G$ has rank less than $m$, you're done: the combination is singular for every $x$, positive or not. In particular, $n\geq m$ is necessary if the problem is to have a non-trivial answer.
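
As a sanity check of this rank test, here is a minimal NumPy sketch (the random $f_i$, $g_i$ and the sizes $m$, $n$ are placeholder data for illustration, not from the question):

```python
import numpy as np

# Placeholder instance: n rank-1 dyads A_i = f_i g_i^T of size m x m.
rng = np.random.default_rng(0)
m, n = 4, 6
F = rng.standard_normal((m, n))  # columns are the f_i
G = rng.standard_normal((m, n))  # columns are the g_i

# sum_i x_i A_i = F @ diag(x) @ G.T, so if rank(F) < m or rank(G) < m,
# the combination is singular for every x, positive or not.
r_F = np.linalg.matrix_rank(F)
r_G = np.linalg.matrix_rank(G)
if min(r_F, r_G) < m:
    print(f"Singular for every x: rank(F) = {r_F}, rank(G) = {r_G}")
else:
    print(f"Inconclusive: F and G both have full row rank {m}")
```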

If not, you have more work to do. And that's where I am personally stuck.

Let $(U_F,\Sigma_F,V_F)$ and $(U_G,\Sigma_G,V_G)$ be economy-sized SVDs of $F$ and $G$, respectively. This means that $$U_F\in\mathbb{R}^{m\times r_F} \quad \Sigma_F\in\mathbb{R}^{r_F\times r_F} \quad V_F\in\mathbb{R}^{n\times r_F}$$ $$U_G\in\mathbb{R}^{m\times r_G} \quad \Sigma_G\in\mathbb{R}^{r_G\times r_G} \quad V_G\in\mathbb{R}^{n\times r_G}$$ Then $$FXG^T=U_F\Sigma_FV_F^TXV_G\Sigma_GU_G^T.$$ Because $U_F\Sigma_F$ has full column rank and $\Sigma_GU_G^T$ has full row rank, the unknown here is the rank of the $r_F\times r_G$ matrix $V_F^TXV_G$. Note that this is not an SVD, though it looks almost like one. Indeed, writing $V_F^TXV_G=\sum_i x_i \tilde{f}_i \tilde{g}_i^T$, where $\tilde{f}_i$ and $\tilde{g}_i$ are the $i$th columns of $V_F^T$ and $V_G^T$, it is just a (possibly) reduced version of the very problem you began with!
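
Continuing the sketch above with the same placeholder data conventions, the following verifies numerically that $\mathop{\textrm{rank}}(FXG^T)=\mathop{\textrm{rank}}(V_F^TXV_G)$ and forms the reduced instance:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 4, 6
F = rng.standard_normal((m, n))
G = rng.standard_normal((m, n))

# Economy-sized SVDs: np.linalg.svd returns Vt = V^T of shape (min(m, n), n).
U_F, s_F, Vt_F = np.linalg.svd(F, full_matrices=False)
U_G, s_G, Vt_G = np.linalg.svd(G, full_matrices=False)
r_F = int(np.sum(s_F > 1e-12 * s_F[0]))
r_G = int(np.sum(s_G > 1e-12 * s_G[0]))
Vt_F, Vt_G = Vt_F[:r_F], Vt_G[:r_G]  # drop numerically zero singular values

# Check the reduction on one random positive x.
x = rng.uniform(0.5, 1.5, size=n)
M = F @ np.diag(x) @ G.T             # the m x m combination
M_red = Vt_F @ np.diag(x) @ Vt_G.T   # the r_F x r_G core V_F^T X V_G
assert np.linalg.matrix_rank(M) == np.linalg.matrix_rank(M_red)

# The core is again a positive combination of dyads: its i-th term is
# x_i times the outer product of column i of Vt_F with column i of Vt_G.
```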

I know this isn't a complete answer, but I figured I shouldn't let the effort go to waste. If someone else, even the OP, is inspired by this to finish the task, then by all means, I look forward to voting them up.

Michael Grant
  • Thank you for the insight. But a minor remark: the matrices are square. – M.A Apr 22 '15 at 13:34
  • That certainly simplifies things a bit. In particular, this means that both of the SVDs produce square U matrices. I do not believe however that it helps us get any closer to a solution. – Michael Grant Apr 22 '15 at 13:36
  • Of course, the fact that you were asking about singularity implies that A is square. I'll go ahead and edit my answer to correct that. – Michael Grant Apr 22 '15 at 13:47
  • A vague idea: since the matrices are of rank one, each has at most one nonzero eigenvalue, which equals its trace. Assume now that they have mixed trace signs. For instance, let $tr(A_1)>0$ and $tr(A_2)<0$. Therefore, for $x_1=[1,0,...,0]^T$ the resulting sum has a positive trace and a positive eigenvalue, and by continuity, the same holds for a neighborhood of $x_1$. For $x_2=[0,1,0,...,0]^T$ and a neighborhood thereof, the resulting sum has a negative trace and a negative eigenvalue. Since the set is connected, is it possible to take a straight line connecting them and conclude that the sum will be singular for some $x$ (zero eigenvalue)? – M.A Apr 22 '15 at 14:39
  • No, because the trace is the sum of the eigenvalues. Just because the trace is zero doesn't mean that any of the eigenvalues are (see the worked example below this thread). – Michael Grant Apr 22 '15 at 14:42
  • I see your point; however, I have a conjecture along these lines: if the trace of each matrix is positive, is it possible to conclude that the sum is nonsingular for all positive $x$, provided that $F$ and $G$ in your terminology have full rank? – M.A Apr 22 '15 at 14:46
  • That seems plausible, yes. But I do not know. – Michael Grant Apr 22 '15 at 14:50
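
To make the objection about traces concrete, here is a small worked example (an editorial illustration, not from the original thread). Take the rank-one matrices $$A_1=\begin{pmatrix}1&0\\0&0\end{pmatrix},\qquad A_2=\begin{pmatrix}0&0\\0&-1\end{pmatrix},$$ so that $\mathop{\textrm{tr}}(A_1)=1>0$ and $\mathop{\textrm{tr}}(A_2)=-1<0$. Then $$x_1A_1+x_2A_2=\begin{pmatrix}x_1&0\\0&-x_2\end{pmatrix},\qquad \det(x_1A_1+x_2A_2)=-x_1x_2\neq 0$$ for every positive $x$. Along the segment from $x=(1,\varepsilon)$ to $x=(\varepsilon,1)$ the trace $x_1-x_2$ changes sign, yet the eigenvalues $x_1$ and $-x_2$ never pass through zero, so connectedness alone cannot force a singular combination.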

Given matrices ${\rm A}_1, {\rm A}_2, \dots, {\rm A}_n \in \Bbb R^{d \times d}$, we want to decide whether there exists a vector $x \geq {\Bbb 0}_n$, $x \neq {\Bbb 0}_n$, such that

$$\mbox{rank} \left( \sum_{k=1}^n x_k {\rm A}_k \right) < d$$


Since the nuclear norm is a convex proxy for the rank, we could solve the following convex program. The nuclear norm is positively homogeneous, so the normalization ${\Bbb 1}_n^T x = 1$ is included to exclude the trivial minimizer $x = {\Bbb 0}_n$.

$$\begin{array}{ll} \underset{x \in \mathbb R^n}{\text{minimize}} & \left\| \displaystyle\sum_{k=1}^n x_k {\rm A}_k \right\|_* \\ \text{subject to} & x \geq {\Bbb 0}_n \\ & {\Bbb 1}_n^T x = 1\end{array}$$

Let $x^{\min}$ denote a minimizer of the aforementioned convex program. If

$$\mbox{rank} \left( \sum_{k=1}^n x_k^{\min} {\rm A}_k \right) < d$$

we are done. If not, we wasted a few minutes of our lives. Note that we did not require that the given matrices ${\rm A}_1, {\rm A}_2, \dots, {\rm A}_n$ be rank-$1$.
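
A minimal sketch of this heuristic, assuming CVXPY with its default SDP-capable solver and random placeholder data (the instance, tolerance, and variable names are illustrative only):

```python
import cvxpy as cp
import numpy as np

# Placeholder instance: n random rank-1 dyads of size d x d.
rng = np.random.default_rng(0)
d, n = 4, 6
A = [np.outer(rng.standard_normal(d), rng.standard_normal(d)) for _ in range(n)]

# Minimize the nuclear norm of the combination over the simplex.
x = cp.Variable(n, nonneg=True)
S = sum(x[k] * A[k] for k in range(n))
prob = cp.Problem(cp.Minimize(cp.normNuc(S)), [cp.sum(x) == 1])
prob.solve()

# Check whether the minimizer yields a rank-deficient combination.
S_min = sum(float(x.value[k]) * A[k] for k in range(n))
print("rank:", np.linalg.matrix_rank(S_min, tol=1e-8), "out of", d)
```

If the reported rank is below $d$, a certificate has been found; otherwise, as the comments below note, the test is inconclusive.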

Rodrigo de Azevedo
  • Have you done some numerical experiment to see the performance? – River Li Mar 17 '21 at 10:43
  • @RiverLi I certainly did not. I often use nuclear norm minimization with small matrices and it seems easy enough. Here is an example with a $6 \times 6$ matrix. Things may be very different when working with large matrices. Nuclear norm minimization is done via semidefinite programming (SDP) and SDPs of "size" greater than $10^5$ can be problematic. – Rodrigo de Azevedo Mar 17 '21 at 11:11
  • It is a nice example for SOS in that link. I think CVX can work here for sizes that are not too large. – River Li Mar 17 '21 at 11:47
  • This is cool! I might be missing something, but it seems this isn’t a “necessary and sufficient” type of procedure. In other words: it may be that there exist instances where such an x exists, but the nuclear norm minimizer will not be such an x. My intuition is that this procedure should work with high probability for certain random inputs. – Chris Harshaw Jul 01 '23 at 19:33