
I already know that if $AB = BA$ and $A$ has no repeated eigenvalues, then $A$ and $B$ have the same eigenvectors.

I also know that if $A$ and $B$ have the same eigenvectors, then they commute: $AB = BA$. To give an affirmative answer to the question, I now need to prove the converse, under the condition that $A$ has no repeated eigenvalues.

J. W. Tanner

2 Answers


Not true. Take any $n \times n$ matrix $A$ with no repeated eigenvalues, and $B = I$ (the $n \times n$ identity matrix). Then every nonzero vector is an eigenvector of $B$, but most are not eigenvectors of $A$.
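A short numerical sketch of this counterexample (the particular $A$, with $n = 2$, is an illustrative assumption; any $A$ with distinct eigenvalues works):

```python
import numpy as np

# Concrete instance of the counterexample: A has distinct eigenvalues,
# B is the identity matrix.
A = np.array([[1.0, 0.0],
              [0.0, 2.0]])   # eigenvalues 1 and 2, no repeats
B = np.eye(2)

assert np.allclose(A @ B, B @ A)   # A and B commute trivially

# v = (1, 1) is an eigenvector of B (every nonzero vector is) ...
v = np.array([1.0, 1.0])
print(np.allclose(B @ v, v))       # True

# ... but not of A: A v = (1, 2) is not a scalar multiple of v.
Av = A @ v
print(Av[0] * v[1] - Av[1] * v[0]) # -1.0: nonzero, so Av is not parallel to v
```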

What is true is that every eigenvector of $A$ is an eigenvector of $B$.

Robert Israel

Case 1: without the condition that $A$ has no repeated eigenvalues.

What we can determine is that $A$ and $B$ have at least one common eigenvector (though we cannot guarantee that all eigenvectors are the same).

Let $\lambda$ be any eigenvalue of $A$, and let the eigenspace corresponding to $\lambda$ be $E_A(\lambda) =\{ x \in \mathbb{C^{n}} \mid Ax = \lambda x \}$.

For all $x \in E_A(\lambda)$, we have
$$ A(Bx) = (AB)x = (BA)x = B(Ax) = B(\lambda x) = \lambda (Bx), $$ which implies that $Bx \in E_A(\lambda)$. Hence, we can regard $T(x) = Bx$ as a linear transformation defined on $E_A(\lambda)$.
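The invariance computation above can be checked numerically; the particular commuting pair below is an assumption chosen for illustration ($A$ has the repeated eigenvalue $1$, and $B$ mixes that eigenspace):

```python
import numpy as np

# A commuting pair chosen for illustration: A has the repeated eigenvalue 1,
# with eigenspace E_A(1) = span{e1, e2}.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 3.0]])
B = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
assert np.allclose(A @ B, B @ A)

lam = 1.0
x = np.array([1.0, -2.0, 0.0])      # any x in E_A(1)
assert np.allclose(A @ x, lam * x)  # x is indeed in the eigenspace

# B maps the eigenspace into itself: A(Bx) = lam * (Bx).
Bx = B @ x
print(np.allclose(A @ Bx, lam * Bx))  # True
```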

We know that any linear transformation of a finite-dimensional vector space over the complex numbers has an eigenvalue and a corresponding eigenvector. An eigenvector $x_0$ of the linear transformation $T$ is therefore an eigenvector of $B$. (Note that an eigenvector of $B$ need not be an eigenvector of $T$, since $T$ is defined only on the subspace $E_A(\lambda)$, not on all of $\mathbb{C}^{n}$.) Since $x_0 \in E_A(\lambda)$, $x_0$ is also an eigenvector of $A$. Thus $A$ and $B$ share the common eigenvector $x_0$.
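This restriction argument can be carried out concretely. Continuing with an illustrative commuting pair (the matrices are assumptions, not from the question), we represent $T$ in a basis of $E_A(1)$ and lift one of its eigenvectors back to $\mathbb{C}^{3}$:

```python
import numpy as np

# Illustrative commuting pair: E_A(1) = span{e1, e2}.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 3.0]])
B = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
assert np.allclose(A @ B, B @ A)

# Columns of P form an (orthonormal) basis of E_A(1); T = B restricted
# to that subspace, expressed as a 2x2 matrix in this basis.
P = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
T = P.T @ B @ P

# An eigenvector of T lifts to a common eigenvector x0 of A and B.
mu, V = np.linalg.eig(T)
x0 = P @ V[:, 0]
print(np.allclose(A @ x0, 1.0 * x0))     # True: eigenvector of A
print(np.allclose(B @ x0, mu[0] * x0))   # True: eigenvector of B
```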

Case 2: with the condition that $A$ has no repeated eigenvalues.

An $n \times n$ matrix $A$ with no repeated eigenvalues has $n$ distinct eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_n$, each with a $1$-dimensional eigenspace, and the corresponding eigenvectors $x_1, x_2, \dots, x_n$ are linearly independent. Taking $\lambda_1$ and $x_1$ as an example: $x_1$ is certainly an eigenvector of the linear transformation $T$ defined on $E_A(\lambda_1) = \operatorname{span}\{x_1\}$, and hence is also an eigenvector of $B$. The same holds for $x_2, \dots, x_n$. Therefore $x_1, x_2, \dots, x_n$ are $n$ linearly independent eigenvectors of the matrix $B$.

However, we cannot conclude that every eigenvector of $B$ is an eigenvector of $A$, because $B$ may have repeated eigenvalues. If $x_1$ and $x_2$ are linearly independent eigenvectors of $B$ corresponding to the same eigenvalue, then any linear combination of them with nonzero coefficients is also an eigenvector of $B$, but it is certainly not an eigenvector of $A$. This is exactly the counterexample given above by Robert Israel.
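A minimal numerical instance of Case 2 (the diagonal matrices are illustrative assumptions: $A$ has distinct eigenvalues, $B$ commutes with $A$ but repeats the eigenvalue $4$):

```python
import numpy as np

# A has distinct eigenvalues; B commutes with A but has a repeated eigenvalue.
A = np.diag([1.0, 2.0, 3.0])
B = np.diag([4.0, 4.0, 5.0])
assert np.allclose(A @ B, B @ A)

e1, e2 = np.eye(3)[0], np.eye(3)[1]

# Every eigenvector of A (here the standard basis vectors) is one of B ...
print(np.allclose(B @ e1, 4.0 * e1))        # True

# ... but v = e1 + e2 is an eigenvector of B and NOT of A.
v = e1 + e2
print(np.allclose(B @ v, 4.0 * v))          # True
print(np.allclose(A @ v, (A @ v)[0] * v))   # False: A v = (1, 2, 0)
```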

In summary:
1. If $AB = BA$, then $A$ and $B$ have at least one common eigenvector.
2. If $AB = BA$ and $A$ has no repeated eigenvalues, then every eigenvector of $A$ is an eigenvector of $B$; in particular, $A$ and $B$ share a common eigenbasis.