3

If $A$ and $B$ are commuting square matrices, then I understand that every eigenvalue of the sum $A+B$ is of the form $\lambda+\mu$, where $\lambda$ is an eigenvalue of $A$ and $\mu$ is an eigenvalue of $B$; however, I can't find a reference that explains why this is true.

If $v$ is a shared eigenvector of $A$ and $B$, then $v$ is certainly an eigenvector of $A+B$, with eigenvalue equal to the sum of the corresponding eigenvalues of $A$ and $B$; however, I'm not sure how to show that this is the only case. From searching online, it seems that the only eigenvectors of $A+B$ in this case are the shared eigenvectors of $A$ and $B$, but that seems fairly nontrivial to show. For example, why can't $v$ be a vector that is not an eigenvector of $A$ or of $B$, yet still satisfies $Av+Bv=\lambda v$? Somehow I suspect that would contradict commutativity, but I'm not sure how.

Any hints or resources someone can point me to?
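To make that worry concrete, here is a small toy check of my own (with $A$ and $B$ diagonal, so they trivially commute) showing that such a $v$ can in fact exist when two eigenvalue sums coincide; even then, the eigenvalue of $A+B$ is still a sum of an eigenvalue of $A$ and an eigenvalue of $B$:

```python
# Toy example: A and B are diagonal, hence commuting.
A = [[1, 0], [0, 2]]
B = [[2, 0], [0, 1]]
v = [1, 1]

def matvec(M, x):
    """2x2 matrix-vector product."""
    return [M[0][0]*x[0] + M[0][1]*x[1],
            M[1][0]*x[0] + M[1][1]*x[1]]

Av, Bv = matvec(A, v), matvec(B, v)
# Av = [1, 2] is not parallel to v (the 2x2 "cross product" is nonzero),
# so v is NOT an eigenvector of A; likewise for B.
assert Av[0]*v[1] - Av[1]*v[0] != 0
assert Bv[0]*v[1] - Bv[1]*v[0] != 0
# Yet (A+B)v = 3v, and 3 = 1+2 = 2+1 is still a sum of eigenvalues;
# the coincidence of the two sums is what makes v an eigenvector of A+B.
assert [Av[i] + Bv[i] for i in range(2)] == [3*v[0], 3*v[1]]
```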

Vibbz
  • If $A$ commutes with $B$, then $A$ and $B$ have exactly the same eigenvectors, though possibly with different eigenvalues (i.e., if $v$ is an eigenvector of $A$, then it is also an eigenvector of $B$).

    Which means $A$, $B$, $A+B$ can be diagonalized under same matrix $P$ like below.

    $A=PD_AP^{-1}$

    $B=PD_BP^{-1}$

    $(A+B)=P(D_A+D_B)P^{-1}$
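A concrete instance of this factorization, with $P$, $D_A$, and $D_B$ picked by hand for illustration (2×2 integer matrices, so $P^{-1}$ is exact):

```python
def matmul(X, Y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P    = [[1, 1], [0, 1]]     # change of basis; det(P) = 1
Pinv = [[1, -1], [0, 1]]    # exact inverse of P
D_A  = [[1, 0], [0, 2]]     # eigenvalues of A on the diagonal
D_B  = [[3, 0], [0, 5]]     # eigenvalues of B on the diagonal

A = matmul(matmul(P, D_A), Pinv)   # A = P D_A P^{-1}
B = matmul(matmul(P, D_B), Pinv)   # B = P D_B P^{-1}
assert matmul(A, B) == matmul(B, A)          # A and B commute

D_sum = [[D_A[i][j] + D_B[i][j] for j in range(2)] for i in range(2)]
S = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]
assert S == matmul(matmul(P, D_sum), Pinv)   # A+B = P (D_A + D_B) P^{-1}
```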

    – Chunhao Jan 07 '25 at 11:36

1 Answer

3

In fact, though commuting matrices are not simultaneously diagonalizable in general, they are always simultaneously upper (or lower) triangularizable. In this case, taking a unitary matrix $U$ such that $U^*AU$ and $U^*BU$ are both upper triangular, the sum $U^*(A+B)U=U^*AU+U^*BU$ is upper triangular as well, and its diagonal entries are precisely the eigenvalues of $A+B$, since unitary similarity preserves eigenvalues.
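As a numerical sanity check of this conclusion, here is a toy example (my own, not from the book; $B$ is taken to be $A^2$ so that the two matrices commute) verifying that every eigenvalue of $A+B$ is a sum $\lambda+\mu$ of eigenvalues of $A$ and $B$:

```python
import math

def matmul(X, Y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def eig2(M):
    """Real eigenvalues of a 2x2 matrix via its characteristic polynomial
    t^2 - tr(M) t + det(M) = 0 (assumes a nonnegative discriminant)."""
    tr = M[0][0] + M[1][1]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    disc = math.sqrt(tr * tr - 4 * det)
    return sorted([(tr - disc) / 2, (tr + disc) / 2])

A = [[4, 1], [2, 3]]      # eigenvalues 2 and 5
B = matmul(A, A)          # B = A^2 commutes with A; eigenvalues 4 and 25
assert matmul(A, B) == matmul(B, A)          # commutativity (both are A^3)

S = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]  # A + B
sums = {lam + mu for lam in eig2(A) for mu in eig2(B)}
assert set(eig2(S)).issubset(sums)   # each eigenvalue of A+B is lam + mu
print(eig2(S))                       # -> [6.0, 30.0], i.e. 2+4 and 5+25
```

(Exact comparison of the floats is safe here because the discriminants are perfect squares.)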

See this post for a reference, or Theorem 2.3.3 in Matrix Analysis, 2nd edition, by Roger A. Horn and Charles R. Johnson.

Bernard Pan