Write $A$ for the generic matrix, whose entries are independent indeterminates $x_{ij}$. In their book on constructive commutative algebra, Lombardi and Quitté write that since the determinant of the family $(e_1,Ae_1,\dots,A^{n-1}e_1)$ is nonzero, the generic matrix is similar to the companion matrix of its characteristic polynomial over the fraction field of $\mathbb Z[x_{ij}]$. I am guessing this remark does not tacitly assume Jordan normal form, so I would like an explanation.
Write the matrix of $A$ in the basis $e_1,Ae_1,\dots,A^{n-1}e_1$ (they assume $e_1$ is cyclic, apparently, and hence that $A$ has a cyclic vector). – Conifold Sep 07 '21 at 19:22
@Conifold If $A$ has a cyclic vector, then most vectors are $A$-cyclic (i.e. $A$-cyclicity is generic). Perhaps that factors in here. – Ben Grossmann Sep 07 '21 at 19:28
@Arrow Could you explain what they mean by the "determinant of the family" in this context? – Ben Grossmann Sep 07 '21 at 19:30
@BenGrossmann I think they mean that a "generic" matrix has a cyclic vector. This is true since the characteristic polynomial is generically the minimal polynomial, which is equivalent to having a cyclic vector. – Conifold Sep 07 '21 at 19:35
@Conifold I think the statement about the determinant of the family has a stronger implication: not only does a generic matrix have a cyclic vector, but the standard basis vector $e_1$ is itself a cyclic vector of a generic matrix. – Ben Grossmann Sep 07 '21 at 19:38
1 Answer
This is not a constructive proof, but one may argue as follows. Since the entries of $A$ are independent indeterminates, if $\det\pmatrix{e_1&Ae_1&\cdots&A^{n-1}e_1}=0$, then the determinant remains zero when the entries are specialised to values in any commutative ring. That is, $\det\pmatrix{e_1&Me_1&\cdots&M^{n-1}e_1}$ would be zero for every matrix $M$ over every commutative ring.

More precisely, pick any commutative ring $R$ and any matrix $M=(m_{ij})_{i,j\in\{1,2,\ldots,n\}}\in R^{n\times n}$. Define a ring homomorphism $\sigma:\mathbb Z[x_{ij}]\to R$ by $\sigma(x_{ij})=m_{ij}$ for all indices $i,j$ and $\sigma(k)=k$ for every integer $k$, where the latter means that $$ \sigma(\underbrace{1_{\mathbb Z}+\cdots+1_{\mathbb Z}}_{k \text{ times}})=\underbrace{1_R+\cdots+1_R}_{k \text{ times}}. $$ Now, if $\det\pmatrix{e_1&Ae_1&\cdots&A^{n-1}e_1}=0$, then, since this determinant is a polynomial in $\mathbb Z[x_{ij}]$, we would have $$ \det\pmatrix{e_1&Me_1&\cdots&M^{n-1}e_1} =\sigma\left(\det\pmatrix{e_1&Ae_1&\cdots&A^{n-1}e_1}\right)=0. $$ Yet the determinant on the LHS is clearly nonzero for some $R$ and some $M$: take $R=\mathbb Z$ and let $M$ be the cyclic permutation matrix that sends $e_i$ to $e_{i+1}$ (indices taken modulo $n$), so that the columns $e_1,Me_1,\ldots,M^{n-1}e_1$ are exactly $e_1,e_2,\ldots,e_n$ and the determinant is $1$. So $\det\pmatrix{e_1&Ae_1&\cdots&A^{n-1}e_1}$ must be nonzero.
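If you want to see this concretely, here is a minimal sketch (not part of the proof, and the use of sympy is my own choice) that checks the claim for small $n$: it builds the Krylov matrix $\pmatrix{e_1&Ae_1&\cdots&A^{n-1}e_1}$ for a symbolic $A$, confirms its determinant is a nonzero polynomial, and verifies that the cyclic permutation matrix witnesses this under specialisation.

```python
import sympy as sp

n = 3
A = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"x{i}{j}"))
e1 = sp.Matrix([1] + [0] * (n - 1))

# Krylov matrix whose columns are e_1, A e_1, ..., A^{n-1} e_1.
K = sp.Matrix.hstack(*[A**k * e1 for k in range(n)])
print(sp.expand(K.det()))   # a visibly nonzero polynomial in the x_ij

# Specialising to the cyclic permutation matrix e_i -> e_{i+1} over Z:
# the Krylov columns become e_1, e_2, ..., e_n, so the determinant is 1.
M = sp.Matrix(n, n, lambda i, j: 1 if (i - j) % n == 1 else 0)
KM = sp.Matrix.hstack(*[M**k * e1 for k in range(n)])
print(KM.det())             # 1
```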
It follows that $\mathcal B=(e_1,Ae_1,\ldots,A^{n-1}e_1)$ is an ordered basis of $F^n$, where $F$ denotes the field of fractions of $\mathbb Z[x_{ij}]$. By construction, the matrix representation of the linear map $L:x\mapsto Ax$ with respect to $\mathcal B$ is a companion matrix $C$: indeed, $L$ sends each basis vector $A^ke_1$ to the next one, and sends $A^{n-1}e_1$ to $A^ne_1$, which is an $F$-linear combination of the basis vectors. Now $A$ and $C$ are matrix representations of the same linear map $L$ under different bases, so they are similar over $F$; and since similar matrices have the same characteristic polynomial, while the characteristic polynomial of a companion matrix can be read off from its last column, $C$ is the companion matrix of the characteristic polynomial of $A$. Hence the result follows.
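To make the change of basis tangible, here is a small sympy sketch for $n=2$ (kept small so the symbolic inverse stays readable): conjugating the generic matrix by $B=\pmatrix{e_1&Ae_1}$ yields the companion matrix $\pmatrix{0&-\det A\\1&\operatorname{tr}A}$ of the characteristic polynomial $t^2-(\operatorname{tr}A)t+\det A$.

```python
import sympy as sp

n = 2
A = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"x{i}{j}"))
e1 = sp.Matrix([1] + [0] * (n - 1))

# Change-of-basis matrix with columns e_1, A e_1, ..., A^{n-1} e_1;
# it is invertible over the fraction field F of Z[x_ij].
B = sp.Matrix.hstack(*[A**k * e1 for k in range(n)])

C = sp.simplify(B.inv() * A * B)
print(C)

# For n = 2 this is the companion matrix of t^2 - (tr A) t + det A.
expected = sp.Matrix([[0, -A.det()], [1, A.trace()]])
print(sp.simplify(C - expected))  # the zero matrix
```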
Remark. Since the characteristic polynomial of a companion matrix coincides with its minimal polynomial, the argument above also shows that the generic matrix satisfies its own characteristic polynomial; specialising the indeterminates as before then yields the Cayley-Hamilton theorem over an arbitrary commutative ring. There is another proof of the Cayley-Hamilton theorem that uses the same indeterminate trick: since the entries of $A$ are independent indeterminates, $A$ is diagonalisable over the algebraic closure of the field of fractions of $\mathbb Z[x_{ij}]$ (otherwise the discriminant of its characteristic polynomial would be zero, and then, by the same specialisation argument, the characteristic polynomial of every numeric matrix would have a repeated root, which we know is false). The Cayley-Hamilton theorem then boils down to the trivial case of diagonal matrices. While the present proof using companion matrices is longer, it is more elementary because it does not invoke the existence of an algebraic closure (which is usually not covered in a first course in abstract algebra).
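For completeness, here is a sympy sketch (again my own illustration, not part of the argument) that checks the Cayley-Hamilton theorem for the generic matrix symbolically, evaluating the characteristic polynomial at $A$ by Horner's scheme; since the entries are independent indeterminates, the resulting identity specialises to every matrix over every commutative ring.

```python
import sympy as sp

n = 2
A = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"x{i}{j}"))
t = sp.Symbol("t")

# Characteristic polynomial of the generic matrix;
# for n = 2 it is t^2 - (tr A) t + det A.
p = A.charpoly(t)

# Evaluate p at A by Horner's scheme, replacing the
# constant term c by c * I.
result = sp.zeros(n, n)
for c in p.all_coeffs():          # coefficients, highest degree first
    result = result * A + c * sp.eye(n)

print(sp.simplify(result))        # the zero matrix
```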