
According to Axler's Down with Determinants!, for any linear transformation $T:F^n\rightarrow F^n$, with $F\subseteq\mathbb{C}$ a subfield of the complex numbers, the existence of a value $\lambda\in\mathbb{C}$ solving a certain polynomial $p_T\in \mathbb{C}[x]$ (i.e., $p_T(\lambda)=0$) proves that $T$ has an eigenvalue, because $p_T(\lambda)=0$ forces $T-\lambda I$ to fail to be injective. It is further shown that $$\mathrm{det}(T)=\prod_{p_T(z)=0}z.$$ But, after using this approach a couple of times, it becomes apparent that some of the assumptions aren't necessary. To be more specific:

  • $F\subseteq\mathbb{C}$ seems to be replaceable with $F\subseteq\overline{F}$, where $\overline{F}$ is the splitting field of $p_T$.
  • Likewise, $\mathbb{C}[x]$ seems to be replaceable with $F[x]$ (this means $p_T$ usually has coefficients in $F$).

An example of this is the zero-determinant matrix $$T=\left[\begin{matrix}1&1\\1&1\end{matrix}\right],$$ where $T:\mathbb{Q}^2\rightarrow\mathbb{Q}^2$. Then $$p_T(x)=x^2-2x=x(x-2),$$ so $p_T\in\mathbb{Q}[x]$ and $\overline{F}=\mathbb{Q}(0,2)=\mathbb{Q}$. Is there a proof of the two statements above for a general transformation? Is there a counterexample?
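A quick sanity check of this example, using sympy (my own choice of tool; any CAS would do):

```python
from sympy import Matrix, symbols

x = symbols('x')
T = Matrix([[1, 1], [1, 1]])

# Characteristic polynomial of T, computed over Q.
p_T = T.charpoly(x).as_expr()
print(p_T.factor())  # x*(x - 2)

# det(T) equals the product of the roots of p_T: 0 * 2 = 0.
print(T.det())       # 0
```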

1 Answer


I'm not sure exactly what you're asking for a proof of, but the following is true:

Theorem: Let $T \in M_n(F)$ be a matrix over an arbitrary field $F$ and $p(t) \in F[t]$ any polynomial (monic, WLOG) such that $p(T) = 0$. Let $L/F$ be a splitting field for $p$. Then the extension of scalars $T \in M_n(L)$ has an eigenvector in $L^n$.

The proof is the same as the one you're familiar with: we just factor $p$ over $L$ and substitute $T$, giving

$$\prod_{i=1}^m (T - r_i) = 0$$

where $m = \deg p$ and $r_i \in L$ are the roots of $p$. Then some $T - r_i$ does not act injectively on $L^n$. None of this is specific to working with subfields of $\mathbb{C}$.
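As a concrete illustration of the theorem (a sympy sketch; the library and the particular matrix are my own choices), take the rotation matrix over $\mathbb{Q}$, whose characteristic polynomial $t^2 + 1$ splits only over $L = \mathbb{Q}(i)$:

```python
from sympy import Matrix, symbols, I, factor

t = symbols('t')
# A matrix over Q whose characteristic polynomial t**2 + 1 has no roots in Q.
T = Matrix([[0, -1], [1, 0]])
p = T.charpoly(t).as_expr()

# Over the splitting field L = Q(i), p factors into linear terms...
print(factor(p, extension=I))  # (t - I)*(t + I)

# ...and the extension of scalars of T acquires eigenvectors in L^2.
for val, mult, vecs in T.eigenvects():
    print(val, list(vecs[0]))
```

Nothing about $\mathbb{C}$ is used here: $L = \mathbb{Q}(i)$ suffices.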

However, you've garbled the argument. Axler constructs two different polynomials; one proves the existence of eigenvalues and is just any polynomial such that $p(T) = 0$ above. The other is the characteristic polynomial $p_T$, which without determinants Axler must define in terms of the eigenvalues and their generalized eigenspaces; this polynomial cannot (in this approach) be used to prove the existence of eigenvalues.
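To see concretely that an annihilating polynomial need not be the characteristic polynomial (a sympy check on the question's own matrix; the extra polynomial is my example):

```python
from sympy import Matrix, eye, zeros

T = Matrix([[1, 1], [1, 1]])
Z = zeros(2, 2)

# The characteristic polynomial x*(x - 2) annihilates T...
assert T * (T - 2 * eye(2)) == Z

# ...but so does any multiple of it, e.g. x**2 * (x - 2)**3 * (x - 5),
# whose extra root 5 is NOT an eigenvalue of T.
p_of_T = T**2 * (T - 2 * eye(2))**3 * (T - 5 * eye(2))
print(p_of_T == Z)  # True
```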

(Personally I also think Axler is completely wrong about determinants. They're wonderful! Axler's approach completely obscures one of the most important facts about determinants, which is that they are given by a polynomial in the entries of a matrix, and in particular vary continuously; I have no idea how you would see this in Axler's approach without comparing to a different, better definition. See e.g. here for a discussion. Many of Axler's concerns about the determinant are handled elegantly by Skip Garibaldi's approach in The characteristic polynomial and determinant are not ad hoc constructions.)

Qiaochu Yuan
  • Thanks! This does clarify a lot, and it is valuable information. However, I did use the same polynomial approach that proved the existence of eigenvalues to find said eigenvalues. If Axler's proof is correct, the same polynomial that finds eigenvalues (in its monic presentation) holds enough information to extract them. – Simón Flavio Ibañez Feb 25 '25 at 00:51
  • @Simón: that is not correct. $p$ could have other roots, or it could have roots with the wrong multiplicities. In your example the characteristic polynomial is $p_T(x) = x(x - 2)$ but the given $T$ also satisfies, for example, $p(x) = x^2 (x - 2)^3 (x - 5)$. You cannot use an arbitrary $p$ to compute the determinant, it must be the characteristic polynomial specifically. – Qiaochu Yuan Feb 25 '25 at 01:09
  • For an explicit proof, take a vector space $V$ of dimension $n$ and a linear operator $T$. Let $0\neq v\in V$. The $n+1$ vectors $$\{v, Tv, T^2v, \dots, T^nv\}\subseteq V$$ are not linearly independent, so there exist scalars $k_0,k_1,\dots,k_n\in F$ (normalized so that $k_n=1$) satisfying $$\sum_{i\leq n}k_iT^iv=\mathbf{0}.$$ The $k_i\in F$ values can be extracted from the resulting system of equations, and then plugged into an $n$-th degree polynomial $p\in F[x]$. Finally, the roots of $p$ are necessarily eigenvalues of $T$. – Simón Flavio Ibañez Feb 25 '25 at 01:16
  • @Simón: this procedure does not uniquely specify the polynomial $p$. If you pick the smallest such $p$ you will get the minimal polynomial, not the characteristic polynomial, and the multiplicities of the roots will be smaller in general. Take for example $T = 2I$; the minimal polynomial is $x - 2$ but the characteristic polynomial is $(x - 2)^2$ and the determinant is $4$, which must be computed from the characteristic polynomial. – Qiaochu Yuan Feb 25 '25 at 01:22
  • @Qiaochu_Yuan I'm aware that not every polynomial does the trick, but, according to my proof, $p_T$ can be extracted without the more complex notion of an eigenspace. – Simón Flavio Ibañez Feb 25 '25 at 01:22
  • That is not correct. How do you propose to calculate the determinant of $T = \begin{bmatrix} 2 & 0 & 0 \\ 0 & 2 & 1 \\ 0 & 0 & 2 \end{bmatrix}$? – Qiaochu Yuan Feb 25 '25 at 01:23
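A sympy computation for this last matrix (tool choice assumed) makes the point explicit: the minimal polynomial is $(x-2)^2$, but the determinant $8$ can only be read off from the characteristic polynomial $(x-2)^3$.

```python
from sympy import Matrix, symbols, eye, zeros, factor

x = symbols('x')
T = Matrix([[2, 0, 0], [0, 2, 1], [0, 0, 2]])

# Characteristic polynomial (x - 2)**3 gives det(T) = 2**3 = 8.
print(factor(T.charpoly(x).as_expr()))  # (x - 2)**3
print(T.det())                          # 8

# Yet T is already annihilated by the smaller polynomial (x - 2)**2 --
# its minimal polynomial -- whose roots with multiplicity would suggest 4.
print((T - 2 * eye(3))**2 == zeros(3, 3))  # True
```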