
Let $A,B$ be two $n\times n$ Hermitian matrices, and let $\lambda_{k}(A)$ denote the $k$-th largest eigenvalue of $A$. I want to prove the following inequality:

$$ \sum_{k=1}^{n}\left|\lambda_{k}(A)-\lambda_{k}(B)\right|^{2} \leqslant\|A-B\|_{F}^{2} $$

where

$$\| A - B \|_{F} := \sqrt{\mbox{tr}\big((A-B)^{H}(A-B)\big)}$$

whose square is equal to

$$\| A - B \|_{F}^{2} = \sum_{k=1}^{n} \sigma_{k}^{2}(A-B),$$

where $\sigma_{k}(A-B)$ is the $k$-th largest singular value of the matrix $A - B$.
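
For concreteness, here is a minimal NumPy sketch checking this identity on a random Hermitian pair (the random test matrices are my own choice, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

def random_hermitian(n):
    """Random n x n Hermitian test matrix."""
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

A, B = random_hermitian(n), random_hermitian(n)
D = A - B

# Frobenius norm squared via the trace definition...
frob_sq_trace = np.trace(D.conj().T @ D).real
# ...and via the squared singular values; the two agree up to rounding.
frob_sq_svals = np.sum(np.linalg.svd(D, compute_uv=False) ** 2)

assert np.isclose(frob_sq_trace, frob_sq_svals)
```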

I think this inequality is related to Weyl's inequality:

$$ \lambda_{i+j-1}(A+B) \leqslant \lambda_{i}(A)+\lambda_{j}(B) \quad \text{for } i+j \leqslant n+1, \qquad \lambda_{i}(A)+\lambda_{j}(B) \leqslant \lambda_{i+j-n}(A+B) \quad \text{for } i+j \geqslant n+1 $$

As a consequence, one can prove

$$ \left|\lambda_{k}(A)-\lambda_{k}(B)\right| \leqslant\|A-B\|, \quad \forall k=1,2, \cdots, n $$

Here $\|A-B\|$ is the spectral norm of $A-B$, i.e., its largest singular value.
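
As a quick sanity check of this bound (not a proof; again with random test matrices of my own choosing):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

def random_hermitian(n):
    """Random n x n Hermitian test matrix."""
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

A, B = random_hermitian(n), random_hermitian(n)

# eigvalsh returns eigenvalues in ascending order; flip to largest-first.
lam_A = np.linalg.eigvalsh(A)[::-1]
lam_B = np.linalg.eigvalsh(B)[::-1]
spec_norm = np.linalg.norm(A - B, 2)  # largest singular value of A - B

# Every per-eigenvalue gap is bounded by the spectral norm.
assert np.all(np.abs(lam_A - lam_B) <= spec_norm + 1e-12)
```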

But this proposition does not seem to help with the inequality above. Are there other ways to prove it?


1 Answer


You may assume WLOG that $A\succeq \mathbf 0$ and $B\succeq \mathbf 0$. If not, re-run the argument on $A' = A + \delta I$ and $B' = B + \delta I$ for real $\delta$ large enough; the shift changes neither the differences $\lambda_k(A)-\lambda_k(B)$ nor the matrix $A-B$. Let $A = Q\Lambda Q^*$ and $B=U\Sigma U^*$ be spectral decompositions, where each diagonal matrix contains the eigenvalues in the usual ordering from largest to smallest. Let $C:= Q^*BQ$, so that $C$ has the same eigenvalues as $B$.

$$\begin{aligned} \sum_{k=1}^{n}\left|\lambda_{k}(A)-\lambda_{k}(B)\right|^{2} &=\big\Vert \Lambda - \Sigma\big \Vert_F^2\\ &=\text{trace}\Big(A^2\Big) - 2\cdot \text{trace}\Big(\Lambda \Sigma\Big)+\text{trace}\Big(C^2\Big)\\ &\leq \text{trace}\Big(A^2\Big) - 2\cdot \text{trace}\Big(\Lambda C\Big)+\text{trace}\Big(C^2\Big)\\ &= \big\Vert \Lambda - C\big \Vert_F^2\\ &=\big\Vert A - B\big \Vert_F^2 \end{aligned}$$

where the inequality

$$\text{trace}\Big(\Lambda C\Big) \leq \text{trace}\Big(\Lambda \Sigma\Big)$$

is justified by the von Neumann trace inequality: since $A$ and $B$ are positive semidefinite, the singular values of $\Lambda$ and $C$ coincide with the eigenvalues of $A$ and $B$ respectively, so $\text{trace}\Big(\Lambda C\Big) \leq \sum_{k}\lambda_k(A)\lambda_k(B) = \text{trace}\Big(\Lambda \Sigma\Big)$. The remaining equalities use the unitary invariance of the trace and of the Frobenius norm: $\text{trace}\Big(\Sigma^2\Big) = \text{trace}\Big(B^2\Big) = \text{trace}\Big(C^2\Big)$, and $Q\big(\Lambda - C\big)Q^* = A - B$.
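
Not part of the argument, but here is a minimal NumPy sketch of the whole chain, assuming random Hermitian test matrices shifted to be positive semidefinite as described above:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

def random_hermitian(n):
    """Random n x n Hermitian test matrix."""
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

A, B = random_hermitian(n), random_hermitian(n)

# Shift so that A, B are positive semidefinite; this changes neither
# the eigenvalue gaps nor A - B.
delta = 1 + max(np.linalg.norm(A, 2), np.linalg.norm(B, 2))
A, B = A + delta * np.eye(n), B + delta * np.eye(n)

# Spectral decompositions with eigenvalues sorted largest-first.
la, Q = np.linalg.eigh(A)
lb, U = np.linalg.eigh(B)
la, Q = la[::-1], Q[:, ::-1]
lb, U = lb[::-1], U[:, ::-1]
Lam, Sig = np.diag(la), np.diag(lb)
C = Q.conj().T @ B @ Q  # same eigenvalues as B

lhs = np.sum((la - lb) ** 2)             # ||Lam - Sig||_F^2
rhs = np.linalg.norm(A - B, "fro") ** 2  # ||A - B||_F^2

# Key step: trace(Lam C) <= trace(Lam Sig) (von Neumann trace inequality).
assert np.trace(Lam @ C).real <= np.trace(Lam @ Sig).real + 1e-9
assert lhs <= rhs + 1e-9
```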
