3

I heard Cayley did the case $n=2$ and then Hamilton generalized the result. How is this done? Does one assume it works for an $n\times n$ matrix and then show it for $(n+1)\times(n+1)$?

Edit: Sorry, Cayley also showed it for $3\times 3$; I meant that Frobenius generalized the result.

user26857
  • 53,190
mtheorylord
  • 4,340
  • You changed the question quite brutally as I was answering it, it seems... – Mariano Suárez-Álvarez Dec 09 '16 at 03:42
  • Because I want the proof, not the history. Sorry My bad. – mtheorylord Dec 09 '16 at 03:43
  • But you got the upvote! – mtheorylord Dec 09 '16 at 03:43
  • I don't care for the upvote, really. I've had my share already. – Mariano Suárez-Álvarez Dec 09 '16 at 03:44
  • See Thomas Hawkins, The Mathematics of Frobenius in Context: A Journey Through 18th to 20th Century Mathematics (preview available from Google Books), pp. 224-226, sec. 7.5.1. Apparently, Frobenius directly proved that the minimal polynomial divides the characteristic polynomial, but his proof relies on the series expansion of $(zI-A)^{-1}$ and it probably does not work over all fields. – user1551 Dec 09 '16 at 05:28
  • I don't know about the history, but using the Jordan normal form might be a good idea for proving it over any field cc @user1551 – reuns Dec 09 '16 at 09:20
  • @user1952009 Jordan form actually complicates things. A simpler proof is to note that a matrix of $n^2$ indeterminates is diagonalisable, in which case the theorem is trivial. See this comment or this question. Another proof is to note that if $q(x)=\operatorname{adj}(xI-A)$, then $\chi(x)I = q(x)(xI-A)$, where $\chi$ is the characteristic polynomial. Substitute $A$ for $x$ and we are done (the computation is spelled out after these comments). For more details, search for Polynomials over Noncommutative Rings and the Cayley Hamilton Theorem on Google. – user1551 Dec 09 '16 at 10:25
  • @user1551 So the matrix $M \in M_n(K(x_{1,1},\ldots,x_{n,n}))$ whose entries $M_{i,j} = x_{i,j}$ are indeterminates is diagonalizable in the form $P D P^{-1}$, where the entries of $P, D, P^{-1}$ belong to the algebraic closure of $K(x_{1,1},\ldots,x_{n,n})$. It is "simpler" (but much more complicated at the same time). When was this proof found, historically? – reuns Dec 09 '16 at 10:33
  • @user1551 And I prefer the proof with $\det(M)I = M \operatorname{adj}(M)$, which, I admit, is much simpler. – reuns Dec 09 '16 at 10:37
  • @user1952009 Both proofs work over all commutative rings (not just fields or rings that are embeddable in fields), and the second proof is the correct version of the usual fallacious proof, which says that $\chi(A)=\det(AI-A)=0$. Personally I like the first one more, because the second one makes use of the adjugate matrix, but without going down the full-fledged multilinear algebra path, the only coordinate-free definition of the adjugate depends on the Cayley-Hamilton theorem. P.S. As to the history of these proofs, I don't know. – user1551 Dec 09 '16 at 10:38
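For the record, here is the telescoping computation behind the adjugate proof sketched in the comments (a standard argument; the notation $B_k$, $c_k$ is mine, not from the thread). Over the polynomial ring one has
$$\operatorname{adj}(xI-A)\,(xI-A)=\det(xI-A)\,I=\chi(x)\,I.$$
Write $\operatorname{adj}(xI-A)=\sum_{k=0}^{n-1}B_k x^k$ with constant matrices $B_k$, and $\chi(x)=\sum_{k=0}^{n}c_k x^k$. Comparing coefficients of $x^k$ gives $B_{k-1}-B_kA=c_kI$, with the conventions $B_{-1}=B_n=0$. Multiplying the $k$-th identity on the right by $A^k$ and summing, the left-hand side telescopes:
$$\chi(A)=\sum_{k=0}^{n}c_kA^k=\sum_{k=0}^{n}\bigl(B_{k-1}A^k-B_kA^{k+1}\bigr)=0.$$
This is legitimate where the naive substitution is not, because each identity $B_{k-1}-B_kA=c_kI$ relates constant matrices.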

2 Answers

4

The concept of the minimal polynomial was introduced by Frobenius in his landmark 1878 paper. He used the language of bilinear forms and the theory of elementary divisors introduced by Weierstrass. He applied the minimal polynomial (which he described as the equation $\varphi(A) = 0$ of lowest degree satisfied by $A$) to give one of the first complete proofs of the Cayley-Hamilton theorem.

Hamilton had shown this result for quaternions, and Cayley had proved it for matrices of orders $N=2$ and $N=3$. Frobenius, however, proved the theorem by showing that the minimal polynomial $\varphi$ of a matrix (or bilinear form) $A$ is a divisor of the characteristic polynomial $\psi$. Hence $\varphi(A) = 0 \Rightarrow \psi(A) = 0$.
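Explicitly: since $\varphi$ divides $\psi$, write $\psi = \varphi\, q$ for some polynomial $q$. Then
$$\psi(A) = \varphi(A)\, q(A) = 0 \cdot q(A) = 0,$$
where $\varphi(A) = 0$ holds by the definition of the minimal polynomial.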

You can also refer to "Frobenius and the symbolical algebra of matrices" in the Archive for History of Exact Sciences here. Hope it helps.

3

He didn't. He only proved it for three-dimensional matrices. In fact, he showed that the matrix satisfies a polynomial equation, but he did not realize that the polynomial was the characteristic one.

Similarly, Cayley only gave a proof for the case of $2\times 2$ matrices, for which he identified the coefficients of the polynomial correctly. He says somewhere that he knows how to do the general case, but he does not state precisely what he means. This is in his Memoir on the theory of matrices.

  • You are right, but how did Frobenius do it? – mtheorylord Dec 09 '16 at 03:42
  • @mtheorylord Showing that a matrix satisfies a polynomial equation is easy: if you take $n^2+1$ distinct powers of an $n\times n$ matrix, they have to be linearly dependent (since $n\times n$ matrices form a vector space of dimension $n^2$ over the base field). I would believe this is how Frobenius showed it, but I have no evidence. (A sketch of the argument follows below.) – Wojowu Dec 09 '16 at 11:04
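To see the dimension count in action, here is a small sympy sketch (the matrix $A$ is an arbitrary example of mine, not one from the thread) that finds a nonzero polynomial of degree at most $n^2$ annihilating a given $3\times 3$ matrix:

```python
import sympy as sp

n = 3
# An arbitrary 3x3 integer matrix; any matrix over a field works.
A = sp.Matrix([[1, 2, 0], [0, 1, 3], [4, 0, 1]])

# Flatten the n^2 + 1 powers I, A, ..., A^(n^2) into columns of length n^2.
powers = [A**k for k in range(n**2 + 1)]
V = sp.Matrix.hstack(*[M.reshape(n**2, 1) for M in powers])

# n^2 + 1 vectors in an n^2-dimensional space must be linearly dependent,
# so the n^2 x (n^2 + 1) matrix V has a nonzero nullspace vector.
coeffs = V.nullspace()[0]

# The nullspace coefficients give a nonzero polynomial p with p(A) = 0.
p_of_A = sum((c * M for c, M in zip(coeffs, powers)), sp.zeros(n, n))
assert p_of_A == sp.zeros(n, n)
```

Note that this argument only produces some annihilating polynomial of degree at most $n^2$; identifying the characteristic polynomial (of degree $n$) as an annihilator is the harder step that Frobenius's divisibility result settles.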