
Let $n>0$, let $H_n$ be the set of $n \times n$ Hermitian matrices, and let $M_n$ be the set of $n \times n$ complex matrices. Let

$$ K := \left\{ (a,b) \in H_n \times M_n : a + \frac12 b b^* \preceq 0 \right\},$$

where $\preceq$ is to be understood as "negative semidefinite", i.e., the opposite is positive semidefinite. I believe $K$ is a closed convex set, because its characteristic function is a Legendre transform, as shown in Lemma 11 of this paper that I'm working on (I took $N = 1$).

I would like to compute, exactly or with an iterative algorithm, the projection onto $K$ for the $\ell^2$ norm built from the Frobenius norm

$$\| (a,b) \| = \sqrt{\|a\|_F^2 + \|b\|_F^2 } ,$$

that is, to solve, given $M \in H_n \times M_n$,

$$ \min_{q\in K} \|M-q\|.$$

This problem is somewhat similar to this question, which shows how to compute the projection onto positive semidefinite matrices, and to an interesting subquestion. However, I can't conclude from it.
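
For reference, the projection mentioned there (of a Hermitian matrix onto the cone of positive semidefinite matrices, in Frobenius norm) is obtained by keeping the eigenvectors and clipping the negative eigenvalues to zero; a minimal NumPy sketch (the function name is only illustrative):

```python
import numpy as np

def project_psd(H):
    """Frobenius-norm projection of a Hermitian matrix H onto the PSD cone:
    diagonalize, clip the negative eigenvalues to zero, recombine."""
    w, V = np.linalg.eigh((H + H.conj().T) / 2)   # symmetrize for safety
    return (V * np.maximum(w, 0.0)) @ V.conj().T
```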


What I have done so far

  • I solved the problem in dimension 1, that is, I computed the Euclidean projection onto $$K= \{(a,b) \in \mathbb R \times \mathbb C \ | \ a + \frac{|b|^2}{2} \leq 0 \}.$$ This problem can be solved with a Lagrange multiplier and I have coded the solver (a sketch of this step, together with the diagonal case from the next bullet, is given after this list).

  • I answered the question in the case where the matrices are diagonal, that is, $M$ is a pair of diagonal matrices. The solution is a pair $q$ of diagonal matrices obtained essentially by identifying $M_n^2$ with $M_n(\mathbb R^2)$: $$p_M : \begin{array}{ccccccc} M_n^2 & \to & M_n(\mathbb R^2) & \to & M_n(\mathbb R^2) & \to & M_n^2 \\ (x_1,x_2) & \mapsto & [(x_1^i,x_2^i)] & \mapsto & [p(x_1^i,x_2^i)] & \mapsto & (p_1(x_1^i,x_2^i) , p_2(x_1^i,x_2^i)) \end{array} $$ where $p : \mathbb R \times \mathbb C \to \mathbb R \times \mathbb C$ is the 1D projection and $p_M$ is the matrix projection I'm looking for.

  • I was hoping to use this to solve the problem, say, in $H_n \times H_n$, which I might be happy with, using a trick like von Neumann's inequality as in this question. But I can't conclude when the matrices don't commute, because I would actually need the components of $M$ (or the components of $p(M)$) to commute.

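A minimal sketch of the first two steps above (my own reconstruction; function names and tolerances are only illustrative). The 1D step uses the KKT conditions $a = \alpha - t$, $b = \beta/(1+t)$ with multiplier $t \geq 0$, and solves the active constraint $a + |b|^2/2 = 0$ by bisection; the diagonal case applies it entrywise to the diagonals:

```python
import numpy as np

def project_1d(alpha, beta, tol=1e-12, max_iter=200):
    """Euclidean projection of (alpha, beta) in R x C onto
    {(a, b) : a + |b|^2 / 2 <= 0}.  KKT gives a = alpha - t and
    b = beta / (1 + t) for some multiplier t >= 0."""
    if alpha + 0.5 * abs(beta) ** 2 <= 0:
        return alpha, beta                      # already feasible
    # f(t) = alpha - t + |beta|^2 / (2 (1 + t)^2) is strictly decreasing,
    # positive at t = 0 and nonpositive at t = alpha + |beta|^2 / 2: bisect.
    lo, hi = 0.0, alpha + 0.5 * abs(beta) ** 2
    for _ in range(max_iter):
        t = 0.5 * (lo + hi)
        if alpha - t + abs(beta) ** 2 / (2.0 * (1.0 + t) ** 2) > 0:
            lo = t
        else:
            hi = t
        if hi - lo < tol:
            break
    t = 0.5 * (lo + hi)
    return alpha - t, beta / (1.0 + t)

def project_diagonal(A, B):
    """Entrywise projection when A (real diagonal) and B (diagonal) are
    both diagonal: apply project_1d to each pair of diagonal entries."""
    pairs = [project_1d(ai.real, bi) for ai, bi in zip(np.diag(A), np.diag(B))]
    return np.diag([p[0] for p in pairs]), np.diag([complex(p[1]) for p in pairs])
```
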
So, if anyone has any remark or hint on this, I would be very grateful.

Tripo
  • A potentially useful observation: $(a, b) \in K$ if and only if the matrix $$ \pmatrix{-2a & b \\ b^* & I} $$ (where $I$ denotes an identity matrix) is positive semidefinite. Incidentally, this gives you an alternative proof of the set's convexity. – Ben Grossmann Mar 06 '25 at 17:08
  • Another observation: because your norm is unitarily invariant, you can reduce this problem to the case where $b$ is "diagonal" with positive entries (via SVD) or to the case where $a$ is diagonal (via spectral decomposition). – Ben Grossmann Mar 06 '25 at 17:18
  • I don't expect any closed-form solution. This blog entry by Higham may be useful. Also, in Ben Grossmann's reformulation in the comment above, note that $\left\|\pmatrix{-2a & b \\ b^\ast & I}\right\|_F^2 = 4\|a\|_F^2 + 2\|b\|_F^2 + {}$ constant. The factor $2$ makes the weights of $\|a\|_F^2$ and $\|b\|_F^2$ uneven. You may consider redefining $K$ using the condition $a+\frac{bb^\ast}{\sqrt{2}}\le 0$ instead, or redefining $\|(a,b)\|$ as $\sqrt{4\|a\|_F^2+2\|b\|_F^2}$ or $\sqrt{2\|a\|_F^2+\|b\|_F^2}$. – user1551 Mar 06 '25 at 19:15
  • Thank you very much for your help. @BenGrossmann, could you please explain why your first remark is true? I'm not so used to dealing with positive matrices. I tried to write something like $\langle x, \mathcal A x\rangle$ with $\mathcal A$ your matrix but could not conclude. Either way, this looks quite promising along with user1551's comment. – Tripo Mar 07 '25 at 14:15
  • @Tripo It comes from the consideration of the Schur complement. In general, a Hermitian matrix of the form $$ M = \pmatrix{A & B \\ B^* & C} $$ (for which $C$ is invertible) will be positive semidefinite if and only if both $C$ is positive definite (which is clear in this case) and the Schur complement $$ M/C = A - BC^{-1}B^* $$ is also positive semidefinite. – Ben Grossmann Mar 07 '25 at 15:36
  • @Tripo Working backwards from a matrix positivity condition to a block matrix whose Schur complement yields the same inequality is a common and useful trick. – Ben Grossmann Mar 07 '25 at 15:38

1 Answer


What do you think of this, based on the help of Ben Grossmann and user1551: for $q = (a,b) \in H_n \times M_n$, denote $$M_q = \pmatrix{-a & \frac{b}{\sqrt{2}} \\ \frac{b^*}{\sqrt{2}} & I}.$$
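
Since the lower-right block $I$ is positive definite, the Schur complement criterion from the comments gives
$$ M_q \succeq 0 \iff -a - \frac{b}{\sqrt{2}}\, I^{-1}\, \frac{b^*}{\sqrt{2}} \succeq 0 \iff a + \tfrac12\, b b^* \preceq 0 \iff q \in K. $$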

We also have $\|M_m - M_q\|_F = \|m-q\|$ for all $m, q \in H_n \times M_n$: the identity blocks cancel and the two off-diagonal blocks together contribute $\|b_m - b_q\|_F^2$. Then, for $m \in H_n \times M_n$, we have

$$\underset{q\in K}{\operatorname{argmin}} \|m-q\| = \underset{q \,:\, M_q \succeq 0}{\operatorname{argmin}} \|M_m-M_q\|_F,$$

so the problem is equivalent to projecting $M_m$ onto the set of positive semidefinite matrices of the form $M_q$, i.e., onto the intersection of the positive semidefinite cone with the affine set of Hermitian matrices whose lower-right block is $I$. Thank you again for the help.
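
To turn this into an iterative algorithm, one possible route (a sketch of mine, not claimed to be the best choice; function names, iteration counts and tolerances are only illustrative) is Dykstra's alternating-projection method applied to the two convex sets whose intersection is exactly $\{M_q : q \in K\}$: the positive semidefinite cone and the affine set of Hermitian matrices whose lower-right block is $I$:

```python
import numpy as np

def project_psd(H):
    """Frobenius projection of a Hermitian matrix onto the PSD cone."""
    w, V = np.linalg.eigh((H + H.conj().T) / 2)
    return (V * np.maximum(w, 0.0)) @ V.conj().T

def project_block_form(X, n):
    """Frobenius projection onto the affine set of Hermitian 2n x 2n
    matrices whose lower-right n x n block is the identity."""
    Y = (X + X.conj().T) / 2
    Y[n:, n:] = np.eye(n)
    return Y

def project_K(a, b, n_iter=1000, tol=1e-10):
    """Approximate projection of (a, b) onto K = {a + b b^*/2 <= 0} via
    Dykstra's alternating projections on the block reformulation M_q."""
    n = a.shape[0]
    x = np.block([[-a, b / np.sqrt(2)],
                  [b.conj().T / np.sqrt(2), np.eye(n)]]).astype(complex)
    p = np.zeros_like(x)
    q = np.zeros_like(x)
    for _ in range(n_iter):
        y = project_psd(x + p)
        p = x + p - y
        x_new = project_block_form(y + q, n)
        q = y + q - x_new
        if np.linalg.norm(x_new - x) < tol:   # crude stopping rule
            x = x_new
            break
        x = x_new
    a_proj = -x[:n, :n]
    return (a_proj + a_proj.conj().T) / 2, np.sqrt(2) * x[:n, n:]
```

For Hermitian `a` and square complex `b`, `project_K(a, b)` should then return an approximation of the projection of $(a,b)$ onto $K$, since $\|M_m - M_q\|_F = \|m - q\|$.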

Tripo