In Strang's *Linear Algebra*, he proves the following (which has been asked and answered on SE, but my question is about a particular part of his proof):
Let $A$ and $B$ be complex $n\times n$ matrices. Prove that if $AB=BA$, then $A$ and $B$ share a common eigenvector.
His proof is as follows: Let $\lambda$ be an eigenvalue of $A$. Starting from $Ax=\lambda x$, we have $ABx=BAx=B\lambda x=\lambda Bx$. So $x$ and $Bx$ are both eigenvectors of $A$ with the same eigenvalue $\lambda$ (unless $Bx=0$). If we assume the eigenvalues of $A$ are distinct, so that each eigenspace is one-dimensional, then $Bx$ must be a multiple of $x$. In other words, $x$ is an eigenvector of $B$ as well as of $A$.
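To make the mechanism concrete, here is a toy example of my own (not from Strang): take

$$A=\begin{pmatrix}1&0\\0&2\end{pmatrix},\qquad B=\begin{pmatrix}3&0\\0&4\end{pmatrix},\qquad x=\begin{pmatrix}1\\0\end{pmatrix}.$$

Here $AB=BA$ and $Ax=1\cdot x$, and since the eigenvalues $1,2$ of $A$ are distinct, the eigenspace for $\lambda=1$ is one-dimensional, so $Bx$ must land back in $\operatorname{span}\{x\}$; indeed $Bx=3x$, so $x$ is an eigenvector of $B$ as well.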
My question is this (I am probably overthinking something quite elementary): when he says "if we assume the eigenvalues of $A$ are distinct...", I agree with the rest of the argument, but why is he allowed to make that assumption?