If $A$ and $B$ are commuting matrices on a finite-dimensional space, then I understand that the eigenvalues of their sum $A+B$ are all of the form (eigenvalue of $A$) + (eigenvalue of $B$). However, I can't find a reference that explains why this is true.
If $v$ is a shared eigenvector of $A$ and $B$, then $v$ is certainly an eigenvector of $A+B$, with eigenvalue equal to the sum of the corresponding eigenvalues of $A$ and $B$. However, I'm not sure how to show that this is the only case. From searching online, it seems that the only eigenvectors of $A+B$ in this situation are the shared eigenvectors of $A$ and $B$, but that looks fairly nontrivial to prove. For example, why can't $v$ fail to be an eigenvector of $A$ or of $B$, yet still satisfy $Av+Bv=\lambda v$? Somehow I suspect that would contradict commutativity, but I'm not sure how.
Any hints or resources someone can point me to?
If $A$ and $B$ commute and are each diagonalizable, then they are simultaneously diagonalizable. This means $A$, $B$, and $A+B$ can all be diagonalized by the same invertible matrix $P$, as below.
$A=PD_AP^{-1}$
$B=PD_BP^{-1}$
$(A+B)=P(D_A+D_B)P^{-1}$
– Chunhao Jan 07 '25 at 11:36
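The simultaneous-diagonalization picture in the comment above can be checked numerically. Here is a minimal sketch (the choice of size, eigenvalues, and random basis matrix is arbitrary): two matrices are built from the same eigenvector matrix $P$, so they commute, and the eigenvalues of $A+B$ come out as the column-paired sums of the eigenvalues of $A$ and $B$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared eigenvector matrix P (retry if nearly singular).
P = rng.standard_normal((3, 3))
while abs(np.linalg.det(P)) < 1e-6:
    P = rng.standard_normal((3, 3))
P_inv = np.linalg.inv(P)

# Diagonal eigenvalue matrices D_A and D_B (arbitrary example values).
D_A = np.diag([1.0, 2.0, 3.0])
D_B = np.diag([10.0, 20.0, 30.0])

# A = P D_A P^{-1}, B = P D_B P^{-1}: same eigenbasis, so they commute.
A = P @ D_A @ P_inv
B = P @ D_B @ P_inv
assert np.allclose(A @ B, B @ A)

# Eigenvalues of A + B are the paired sums 1+10, 2+20, 3+30.
eig_sum = np.sort(np.linalg.eigvals(A + B).real)
print(eig_sum)  # close to [11. 22. 33.]
```

Note that the pairing matters: the eigenvalues of $A+B$ are sums of eigenvalues of $A$ and $B$ *matched through a common eigenvector*, not arbitrary sums of one eigenvalue from each.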