
Let $F$ be an algebraically closed field and $V$ a finite dimensional vector space over $F$. Let $S, T$ be two linear operators on $V$ with $ST = TS$, and suppose the characteristic polynomial of $S$ has distinct roots.

I want to show 2 things:

  1. Every eigenvector of $S$ is an eigenvector of $T$
  2. If $T^k = 0$ for some $k \in \mathbb{N}$, then $T = 0$

A quick search turns up many similar questions, but none in quite this setting. Most either show the converse (if $S, T$ have the same eigenvectors then $ST = TS$), or drop the assumption that the field is algebraically closed and that the characteristic polynomial of $S$ has distinct roots. It can be shown pretty easily that without this assumption the assertion is false, an easy counterexample being $S = \mathrm{Id}$ and $T = $ (almost) anything: then every vector is an eigenvector of $S$, which won't be true for most operators $T$.
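To make the failure concrete when the distinct-roots assumption is dropped (a $2 \times 2$ instance of the counterexample above):

$$S = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad T = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}.$$

Here $ST = TS = T$, and the characteristic polynomial of $S$ is $(x-1)^2$, a repeated root. Then $e_2$ is an eigenvector of $S$ but not of $T$, and $T^2 = 0$ with $T \neq 0$, so both assertions fail.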

So the answer must have something to do with the distinct (linear) roots of the characteristic polynomial of $S$. I know that distinct eigenvalues correspond to distinct eigenvectors, and I'd like to get a basis consisting of these eigenvectors, but nothing stops $0$ from being an eigenvalue of $S$. It's also very easy to show that if $v$ is an eigenvector of $S$, then so is $Tv$, but that hasn't gotten me anywhere.

At this point I'm lost. Any help would be nice.

S.H.
  • "distinct eigenvectors" doesn't mean anything to me. What did you mean by that? – Brian Moehring Jan 11 '25 at 22:19
  • I thought that distinct eigenvalues corresponded to linearly independent eigenvectors. That's obviously not true now that I think about it, but clearly n distinct eigenvalues means n distinct eigenvectors they just may not be linearly independent like I thought – S.H. Jan 11 '25 at 22:32
  •
    Oh, $\dim(V)$ distinct eigenvalues does imply any set of corresponding eigenvectors forms a basis of $V$. I just don't know what you could mean by "distinct eigenvectors". Interpreting it literally, we can always find "distinct eigenvectors", but I've never seen an argument where we needed to find distinct eigenvectors. – Brian Moehring Jan 11 '25 at 22:40
  •
    When I said 'distinct' I meant 'linearly independent' I should've been more clear – S.H. Jan 11 '25 at 22:46

1 Answer


Regarding 1.:

If $Sv = \lambda v$ then $$STv = TSv = T\lambda v = \lambda Tv,$$ so $Tv$ lies in the $\lambda$-eigenspace of $S$. Since the characteristic polynomial of $S$ has $\dim V$ distinct roots, each eigenspace of $S$ is one dimensional, spanned by $v$. This means that $Tv = \mu v$ for some scalar $\mu$ (possibly $\mu = 0$).
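Not part of the proof, but here is a quick numerical sanity check of this step (a sketch; the specific matrix $S$ and the polynomial defining $T$ are arbitrary choices for illustration — any polynomial in $S$ commutes with $S$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build S = P D P^{-1} with distinct diagonal entries,
# so the characteristic polynomial of S has distinct roots.
D = np.diag([1.0, 2.0, 3.0])
P = rng.standard_normal((3, 3))
S = P @ D @ np.linalg.inv(P)

# Any polynomial in S commutes with S; take T = S^2 - 4S + I.
T = S @ S - 4 * S + np.eye(3)
assert np.allclose(S @ T, T @ S)

# Each eigenvector v of S should satisfy Tv = mu * v for some scalar mu.
eigvals, eigvecs = np.linalg.eig(S)
for i in range(3):
    v = eigvecs[:, i]
    w = T @ v
    mu = (w @ v.conj()) / (v @ v.conj())  # projection onto v gives the candidate scalar
    assert np.allclose(w, mu * v)
```
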

Regarding 2.:

Make use of the fact that $V$ has a basis of eigenvectors of $S$, which by 1. are also eigenvectors of $T$, and the fact that for an eigenvector $v$ of $T$ with eigenvalue $\mu$, $T^k v = \mu^k v$. Then conclude that $T^k = 0$ implies $\mu = 0$ for every eigenvalue $\mu$ of $T$, so $T$ vanishes on a basis.
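Spelling out the conclusion: if $v_1, \dots, v_n$ is such a basis with $Tv_i = \mu_i v_i$, then

$$0 = T^k v_i = \mu_i^k v_i \implies \mu_i^k = 0 \implies \mu_i = 0,$$

so $Tv_i = 0$ for every $i$, and a linear operator that vanishes on a basis is zero.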

Thomas