
Let $V$ be a finite-dimensional vector space and let $T,S$ be diagonalizable linear transformations from $V$ to itself. I need to prove that if $TS=ST$, then every eigenspace $V_\lambda$ of $S$ is $T$-invariant and the restriction of $T$ to $V_\lambda$ ($T:{V_{\lambda }}\rightarrow V_{\lambda }$) is diagonalizable. In addition, I need to show that there is a basis $B$ of $V$ such that $[S]_{B}^{B}$ and $[T]_{B}^{B}$ are both diagonal if and only if $TS=ST$.

Ok, so first let $v\in V_\lambda$. From $TS=ST$ we get $S(T(v))=T(S(v))=T(\lambda v)=\lambda T(v)$, so $T(v)\in V_\lambda$ (it is either zero or an eigenvector of $S$ with eigenvalue $\lambda$), which gives the invariance. I want to use that in order to get the following claim, I just don't know how. One direction of the "iff" is obvious; the other one is trickier for me.
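To illustrate the invariance numerically, here is a minimal sketch with a toy example of my own (the matrices are chosen so that $S$ and $T$ commute and $S$ has a repeated eigenvalue $\lambda=2$):

```python
import numpy as np

# Toy example: S and T commute, S has eigenvalue 2 with a 2-dimensional
# eigenspace V_2 = span(e1, e2), and eigenvalue 5 on span(e3).
S = np.diag([2.0, 2.0, 5.0])
T = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 3.0]])  # symmetric, hence diagonalizable

assert np.allclose(S @ T, T @ S)  # TS = ST

# For v in V_2, check S(Tv) = 2 * (Tv): T maps V_2 into itself.
for v in (np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])):
    w = T @ v
    assert np.allclose(S @ w, 2.0 * w)
```

Note that $T$ restricted to $V_2$ here is the block $\begin{pmatrix}1&1\\1&1\end{pmatrix}$, which is not diagonal but is diagonalizable, matching the claim to be proved.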


3 Answers


This answer is basically the same as Paul Garrett's. --- First I'll state the question as follows.

Let $V$ be a finite dimensional vector space over a field $K$, and let $S$ and $T$ be diagonalizable endomorphisms of $V$. We say that $S$ and $T$ are simultaneously diagonalizable if (and only if) there is a basis of $V$ which diagonalizes both. The theorem is

$S$ and $T$ are simultaneously diagonalizable if and only if they commute.

If $S$ and $T$ are simultaneously diagonalizable, they clearly commute. For the converse, I'll just refer to Theorem 5.1 of The minimal polynomial and some applications by Keith Conrad. [Harvey Peng pointed out in a comment that the link to Keith Conrad's text was broken. I hope the link will be restored, but in the meantime here is a link to the Wayback Machine version. Edit: original link just updated.]

EDIT. The key statement to prove the above theorem is Theorem 4.11 of Keith Conrad's text, which says:

Let $A: V \to V$ be a linear operator. Then $A$ is diagonalizable if and only if its minimal polynomial in $F[T]$ splits in $F[T]$ and has distinct roots.

[$F$ is the ground field, $T$ is an indeterminate, and $V$ is finite dimensional.]
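As a concrete illustration of Theorem 4.11 (a toy example of my own, not from Conrad's text), a Jordan block fails the criterion because its minimal polynomial has a repeated root:

```python
import numpy as np

# The Jordan block J has minimal polynomial (T - 1)^2:
# (J - I) is nonzero but squares to zero. The repeated root
# is exactly why J is not diagonalizable, per Theorem 4.11.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
N = J - np.eye(2)

assert not np.allclose(N, 0)   # (T - 1) alone does not annihilate J
assert np.allclose(N @ N, 0)   # (T - 1)^2 does: the min poly has a double root
```

By contrast, a diagonal matrix with distinct diagonal entries is annihilated by a product of distinct linear factors, so its minimal polynomial is squarefree.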

The key point to prove Theorem 4.11 is to check the equality $$V=E_{\lambda_1}+\cdots+E_{\lambda_r},$$ where the $\lambda_i$ are the distinct eigenvalues and the $E_{\lambda_i}$ are the corresponding eigenspaces. One can prove this by using Lagrange's interpolation formula: put $$f:=\sum_{i=1}^r\ \prod_{j\not=i}\ \frac{T-\lambda_j}{\lambda_i-\lambda_j}\ \in F[T]$$ and observe that $f(A)$ is the identity of $V$.
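To see the interpolation identity $f(A)=\mathrm{id}_V$ in action, here is a small numerical check (the matrix and its eigenvalues are a toy example of mine, not from Conrad's text):

```python
import numpy as np

# A diagonalizable A with distinct eigenvalues 1, 4, 9 (eigenvalue 1 repeated).
A = np.diag([1.0, 1.0, 4.0, 9.0])
lams = [1.0, 4.0, 9.0]          # the distinct eigenvalues lambda_1..lambda_r
n = A.shape[0]

# Evaluate f(A) = sum_i prod_{j != i} (A - lambda_j I) / (lambda_i - lambda_j).
fA = np.zeros_like(A)
for i, li in enumerate(lams):
    term = np.eye(n)
    for j, lj in enumerate(lams):
        if j != i:
            term = term @ (A - lj * np.eye(n)) / (li - lj)
    fA += term

assert np.allclose(fA, np.eye(n))  # f(A) is the identity of V
```

Each summand in the loop is in fact the projection onto the eigenspace $E_{\lambda_i}$, which is how the equality $V=E_{\lambda_1}+\cdots+E_{\lambda_r}$ drops out.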

  • You've listed a document by Keith Conrad. I believe this one by the same author also has a very simple proof for the question being asked without reference to a minimal polynomial. I found it helpful since I didn't know what a minimal polynomial is: http://www.math.uconn.edu/~kconrad/blurbs/linmultialg/simulcomm.pdf – Sherif F. Sep 24 '14 at 17:26
  • @Pierre-Yves Gaillard, you write, ``let $S$ and $T$ be diagonalizable endomorphisms of $V$.'' Please provide a definition of diagonalizable endomorphisms of $V$. In addition, please include an example of a diagonalizable endomorphism of $V$. I believe that if you do so, your solution will be more easily understood by a wider audience (like me). – Michael Levy Mar 08 '22 at 16:50
  • @MichaelLevy - The definition of diagonalizable endomorphisms and examples of such are given in the links https://kconrad.math.uconn.edu/blurbs/linmultialg/minpolyandappns.pdf and https://kconrad.math.uconn.edu/blurbs/linmultialg/simulcomm.pdf. (Both texts were written by Keith Conrad.) – Pierre-Yves Gaillard Mar 08 '22 at 17:33
  • I am sorry, but the definition of diagonalizable endomorphism is in neither of these two references; nor could I find it elsewhere. In the context of this problem, do you mean a linear operator, $A : V \to V$, where $A$ is diagonalizable? – Michael Levy Mar 08 '22 at 23:39
  • @MichaelLevy - Yes, in the context of this problem, the expressions "endomorphism" and "linear operator" are synonymous. – Pierre-Yves Gaillard Mar 09 '22 at 00:35
  • @Pierre-Yves Gaillard , Considering a linear transformation's characteristic polynomial over the complex field, is it true that if its characteristic equation has roots with multiplicity 1, and only roots with multiplicity 1, then its minimal polynomial and its characteristic polynomial are one and the same? – Michael Levy Mar 09 '22 at 02:11
  • In Theorem 3.2 in the pdf you cite [1], it states, "A linear operator on $V$ whose characteristic polynomial is a product of linear factors in $F[T]$ with distinct roots is diagonalizable." I understand that $V$ is the vector space and that $F$ is the field. It looks like $T$ is an eigenvalue (but I am not sure). I understand what a linear factor is. I am, however, unclear what the notation $F[T]$ denotes, and what a linear factor in $F[T]$ would be. Might you explain? And if possible might you provide an example? [1] www.math.uconn.edu/~kconrad/blurbs/linmultialg/minpolyandappns.pdf – Michael Levy Mar 09 '22 at 08:32
  • @MichaelLevy - Answer to Question 1: yes. What is $T$? It is an indeterminate, see https://en.wikipedia.org/wiki/Polynomial_ring#Definition_(univariate_case). A linear factor in $F[T]$ is a factor of the form $T-a$ with $a\in F$. Example: $T-1$ is a linear factor of $T^2-1$. – Pierre-Yves Gaillard Mar 09 '22 at 12:30
  • I am sorry, but the link seems broken. – Harvey Peng Jul 03 '24 at 12:16
  • @HarveyPeng - Thanks! I made an edit. – Pierre-Yves Gaillard Jul 03 '24 at 13:22

You've proven (from $ST=TS$) that the $\lambda$-eigenspace $V_\lambda$ of $T$ is $S$-stable. The diagonalizability of $S$ on the whole space is equivalent to its minimal polynomial having no repeated factors. Its minimal poly on $V_\lambda$ divides that on the whole space, so is still repeated-factor-free, so $S$ is diagonalizable on that subspace. This gives an induction to prove the existence of a simultaneous basis of eigenvectors. Note that it need not be the case that every eigenvector of $T$ is an eigenvector of $S$, because eigenspaces can be greater-than-one-dimensional.

Edit: Thanks Arturo M. Yes, over a not-necessarily algebraically closed field, one must say that "diagonalizable" is equivalent to the minimal polynomial having no repeated factors and splitting into linear factors.

Edit 2: $V_\lambda$ being "S-stable" means that $SV_\lambda\subset V_\lambda$, that is, $Sv\in V_\lambda$ for all $v\in V_\lambda$.
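The induction above can be sketched numerically. The following is a minimal sketch of my own, assuming real symmetric matrices so that `numpy.linalg.eigh` supplies the needed diagonalizations; the function name and example matrices are hypothetical choices, not from the answer itself:

```python
import numpy as np

def simultaneous_eigenbasis(S, T, tol=1e-9):
    """Sketch: for commuting symmetric S and T, build a common eigenbasis
    by diagonalizing S restricted to each eigenspace of T."""
    assert np.allclose(S @ T, T @ S)
    evals, evecs = np.linalg.eigh(T)     # eigendecomposition of T
    basis, i, n = [], 0, len(evals)
    while i < n:
        # Group columns belonging to one eigenvalue of T (one eigenspace V_lambda).
        j = i
        while j < n and abs(evals[j] - evals[i]) < tol:
            j += 1
        Q = evecs[:, i:j]                # orthonormal basis of V_lambda
        # S is V_lambda-stable; its restriction Q^T S Q is again symmetric,
        # hence diagonalizable, as argued in the answer.
        _, P = np.linalg.eigh(Q.T @ S @ Q)
        basis.append(Q @ P)              # common eigenvectors
        i = j
    return np.hstack(basis)

# Usage: both matrices become diagonal in the basis B.
S = np.array([[2., 1., 0.], [1., 2., 0.], [0., 0., 3.]])
T = np.array([[5., 0., 0.], [0., 5., 0.], [0., 0., 7.]])
B = simultaneous_eigenbasis(S, T)
for M in (S, T):
    D = B.T @ M @ B
    assert np.allclose(D, np.diag(np.diag(D)))
```

Note how the example reflects the caveat in the answer: the eigenvectors $e_1, e_2$ of $T$ (eigenvalue $5$) are not eigenvectors of $S$; only after re-diagonalizing $S$ inside that two-dimensional eigenspace does one get the common basis.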

paul garrett

As you proved, each eigenspace of $S$ is invariant (stable) under $T$. Since $S$ is diagonalizable, we can split the domain $V$ of both operators into a direct sum of eigenspaces of $S$. If we now choose a basis by first choosing a basis of the 1st eigenspace of $S$ (corresponding to eigenvalue $\lambda_1$), then a basis of the 2nd, etc., we get a basis of $V$ in which the matrix of $S$ is diagonal: on its diagonal it first has $\lambda_1$ (1 or more times, depending on the dimension of the corresponding eigenspace), then $\lambda_2$, etc.

Simultaneously, the matrix of $T$ is block diagonal in this basis (due to the proven stability of the eigenspaces). Now, its diagonal blocks $B_1, B_2, \dots$ are also diagonalizable: each block's minimal polynomial divides the minimal polynomial of $T$, which factors into linear factors without multiple roots, so each divisor has this same property, which is equivalent to diagonalizability.

If we now diagonalize $B_1$ (with a base change on eigenspace 1, using matrices $P_1$ and $P_1^{-1}$), $B_2$ (with $P_2$ and $P_2^{-1}$), and so on, and if we form a block-diagonal matrix $P$ with diagonal blocks $P_1, P_2, \dots$ (so that $P^{-1}$ is block diagonal with blocks $P_1^{-1}, P_2^{-1}, \dots$), then these two matrices diagonalize $T$ in the new basis, while $S$ remains the same matrix in this new basis, and it is already diagonal.
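A minimal numerical sketch of this construction (the matrices $S$, $B_1$, $B_2$ are a toy example of mine; `scipy.linalg.block_diag` assembles the block-diagonal matrices):

```python
import numpy as np
from scipy.linalg import block_diag

# S is already diagonal, with eigenvalue 2 twice and eigenvalue 5 once;
# T is block diagonal in the same basis because the eigenspaces of S
# are T-stable.
S = np.diag([2.0, 2.0, 5.0])
B1 = np.array([[0.0, 1.0],
               [1.0, 0.0]])       # diagonalizable block on the eigenspace of 2
B2 = np.array([[3.0]])            # block on the eigenspace of 5
T = block_diag(B1, B2)

# Diagonalize each block separately ...
_, P1 = np.linalg.eigh(B1)
_, P2 = np.linalg.eigh(B2)
P = block_diag(P1, P2)            # ... and assemble the block-diagonal P

T_new = np.linalg.inv(P) @ T @ P  # T becomes diagonal
S_new = np.linalg.inv(P) @ S @ P  # S is unchanged (still the same diagonal)

assert np.allclose(T_new, np.diag(np.diag(T_new)))
assert np.allclose(S_new, S)
```

The reason $S$ survives the base change untouched is that $P$ only mixes vectors within a single eigenspace of $S$, where $S$ acts as a scalar.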

This is a somewhat long argument, but in my opinion it is relatively elementary compared to the linked proofs.