
I found a question at the very end of this video, which is essentially the following:

Let $n$ be some natural number. Suppose there is a function $f : \mathbb{R}^{n \times n} \rightarrow \mathbb{R}$ satisfying the following two properties:

  • $f(A\cdot B) = f(A)\cdot f(B),\ \forall A,B \in \mathbb{R}^{n \times n} $
  • $f(\text{diag}(a_1,a_2,...,a_n)) = \prod_{i=1}^{n}a_i$, for any diagonal matrix $\text{diag}(a_1,a_2,...,a_n) \in \mathbb{R}^{n \times n}$ whose diagonal entries (from top-left to bottom-right) are $a_1, ... ,a_n$.

Show that $f(A) = \det(A)$ (the determinant of $A$) for all $A \in \mathbb{R}^{n \times n}$, i.e., show that the above two properties characterize the determinant function for real square matrices.
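
(As a quick sanity check, not part of the problem, one can confirm that $\det$ itself satisfies both properties; the matrices in this sympy snippet are arbitrary choices:)

```python
import sympy as sp

# Two arbitrary test matrices.
A = sp.Matrix([[1, 2], [3, 4]])
B = sp.Matrix([[0, 5], [1, -2]])

# Property 1: det(AB) = det(A) det(B).
assert (A * B).det() == A.det() * B.det()

# Property 2: det of a diagonal matrix is the product of its diagonal entries.
assert sp.diag(3, -7).det() == 3 * (-7)
```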

My solution:

Suppose $A$ is not invertible. Then we can find invertible matrices $P$ and $Q$ such that $Q^{-1} A P = A'$ is diagonal with ones and zeros on its diagonal (the rank normal form of $A$). As $A$ is not invertible, $A'$ must have at least one zero on the diagonal. Hence $f(Q^{-1})f(A)f(P) = f(A') = 0$, since $A'$ is diagonal with a zero on the diagonal.

But for any invertible matrix $C$, $\ 1 = f(I) = f(CC^{-1}) = f(C)f(C^{-1})$ (using $f(I) = f(\text{diag}(1,\ldots,1)) = 1$ from the second property), so $f(C) \neq 0$. In particular, $f(Q^{-1})\neq 0$ and $f(P) \neq 0$. Hence $f(A) = 0 = \det(A)$ for any non-invertible $A$.
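
To make this step concrete, here is a small sympy illustration with a hand-picked singular $A$ and explicit invertible $P$ and $Q$ (one row operation and one column operation):

```python
import sympy as sp

# A hand-picked singular matrix (rank 1).
A = sp.Matrix([[1, 2], [2, 4]])

# Q^{-1} performs the row operation    R2 <- R2 - 2*R1,
# P     performs the column operation  C2 <- C2 - 2*C1.
Qinv = sp.Matrix([[1, 0], [-2, 1]])
P = sp.Matrix([[1, -2], [0, 1]])

# The result is diagonal with a zero on the diagonal.
assert Qinv * A * P == sp.diag(1, 0)
```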

Next, we know that any invertible matrix $X$ can be written as a product of elementary matrices, $X=E_1 \cdots E_k$, so if we can show that $f(E) = \det(E)$ for every elementary matrix $E$, then $f(X) = f(E_1)\cdots f(E_k) = \det(E_1)\cdots\det(E_k) = \det(E_1\cdots E_k) = \det(X)$, and hence we would be done.

Hence, it suffices to prove that $f(E) = \det(E)$ for every elementary matrix $E$.
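
As a concrete illustration of the factorization claim, here is a small sympy example; the matrix $X$ and the row operations were chosen by hand via Gauss-Jordan elimination:

```python
import sympy as sp

X = sp.Matrix([[2, 1], [1, 1]])  # a hand-picked invertible matrix

# Elementary matrices reducing X to I, found by hand:
E1 = sp.Matrix([[sp.Rational(1, 2), 0], [0, 1]])   # R1 <- (1/2) R1
E2 = sp.Matrix([[1, 0], [-1, 1]])                  # R2 <- R2 - R1
E3 = sp.Matrix([[1, 0], [0, 2]])                   # R2 <- 2 R2
E4 = sp.Matrix([[1, -sp.Rational(1, 2)], [0, 1]])  # R1 <- R1 - (1/2) R2

assert E4 * E3 * E2 * E1 * X == sp.eye(2)

# Hence X = E1^{-1} E2^{-1} E3^{-1} E4^{-1}, a product of elementary
# matrices, and det is multiplicative over the factors:
assert E1.inv() * E2.inv() * E3.inv() * E4.inv() == X
assert E1.inv().det() * E2.inv().det() * E3.inv().det() * E4.inv().det() == X.det()
```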

  • [One useful observation: if $Y$ is a diagonalizable matrix, then there exists an invertible matrix $M$ such that $D = M^{-1} Y M$ is diagonal. Clearly $f(D) = \det(D) = $ the product of the diagonal entries, for any diagonal matrix $D$. Since $f$ takes values in $\mathbb{R}$, these scalars commute, so $\det(Y) = \det(D) = f(D) = f(M^{-1})f(Y)f(M) = f(Y)f(M^{-1}M) = f(Y)f(I) = f(Y)$. So $f(Y) = \det(Y)$ for any diagonalizable $Y$.]

  • If $E$ is an elementary matrix corresponding to multiplication of a row by a non-zero constant $c$, then $E$ is diagonal with one diagonal entry equal to $c$ and all other diagonal entries equal to $1$, hence $f(E) = c = \det(E)$.

  • If $E$ is an elementary matrix corresponding to swapping row $i$ and row $j$ ($i\neq j$), then $E$ is symmetric, hence diagonalizable by the spectral theorem for real symmetric matrices. Hence $f(E) = \det(E)$, by the above observation.

  • If $E$ is an elementary matrix corresponding to adding $c$ times the $i$th row to the $j$th row ($i\neq j$), then $E$ is the square of the matrix $E'$ which adds $c/2$ times the $i$th row to the $j$th row, so $f(E) = f(E')^2$. Taking $Z$ to be the matrix which swaps rows $i$ and $j$ (which is its own inverse), one can check that $Z^{-1} E Z = E^T$, so $f(E) = f(E^T)$. As $EE^T$ is symmetric, it is diagonalizable by the spectral theorem for real symmetric matrices, hence $f(E)^2 = f(E)f(E^T) = f(EE^T) = \det(EE^T) = \det(E)^2 = 1$. As $E'$ defined above is an elementary matrix of the same type as $E$, $f(E')^2 = 1$ as well. So $f(E) = f(E')^2 = 1 = \det(E)$. (A numerical sanity check of these three cases is given after the proof.)

This finishes the proof for real square matrices.
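
Here is a quick numerical sanity check of the three elementary-matrix cases above (illustrative only; the size $n=3$, the constant $c=4$, and the row indices are hand-picked):

```python
import sympy as sp

n, c, i, j = 3, 4, 0, 2   # hand-picked size, constant, and row indices

# Type 1: scale row i by c -- already diagonal.
E_scale = sp.eye(n); E_scale[i, i] = c
assert E_scale.det() == c

# Type 2: swap rows i and j -- symmetric, hence diagonalizable.
E_swap = sp.eye(n)
E_swap[i, i] = E_swap[j, j] = 0
E_swap[i, j] = E_swap[j, i] = 1
assert E_swap.is_symmetric() and E_swap.is_diagonalizable()

# Type 3: add c times row i to row j.
E_add = sp.eye(n);  E_add[j, i] = c
E_half = sp.eye(n); E_half[j, i] = sp.Rational(c, 2)
assert E_half * E_half == E_add             # E = E'^2
assert E_swap * E_add * E_swap == E_add.T   # Z^{-1} E Z = E^T  (Z = Z^{-1})
assert (E_add * E_add.T).is_symmetric()     # E E^T is symmetric
```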

Note that in the above proof, I never used the fact that the field is the field of real numbers, except where I used the spectral theorem to conclude that a real symmetric matrix is diagonalizable. This is not true over every field (see: Is a symmetric matrix over a field F always diagonalizable?).

So my question is as follows:

Can we give a proof that works over every field, or can we find a field over which the two properties above do not characterize the determinant?

I would be very grateful for any help.

A geometric interpretation (along the lines of the video from which I picked this question) would be highly appreciated.

Note: for a field with just two elements ($0$ and $1$), i.e. $\mathbb{F}_2 = \mathbb{Z}/2\mathbb{Z}$, the solution is trivial: for a non-invertible $A$, $f(A)=0=\det(A)$, and for an invertible $B$, $f(B)$ and $\det(B)$ are both non-zero, so they must both equal $1$. Hence we are done in this case.
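
A brute-force confirmation for $n=2$, simply enumerating all sixteen $2\times 2$ matrices over $\mathbb{F}_2$ (illustrative only):

```python
from itertools import product

def det2(m):
    # Determinant of a 2x2 matrix over F_2.
    (a, b), (c, d) = m
    return (a * d - b * c) % 2

mats = [((a, b), (c, d)) for a, b, c, d in product((0, 1), repeat=4)]
invertible = [m for m in mats if det2(m) != 0]

# |GL_2(F_2)| = (2^2 - 1)(2^2 - 2) = 6, and 1 is the only nonzero scalar,
# so det (and any f with the two properties) equals 1 on all of them.
assert len(invertible) == 6
assert all(det2(m) == 1 for m in invertible)
```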

  • A similar but slightly weaker characterisation of the determinant is also given in the second paragraph of the Wikipedia page on determinants (https://en.m.wikipedia.org/w/index.php?title=Determinant&wprov=rarw1). It uses triangular matrices instead of diagonal matrices. – Avyaktha Achar Aug 16 '24 at 13:26

1 Answer


Here is an argument which works for any field $K$ with at least $3$ elements.

Some facts that you already proved:

  • $f$ coincides with $\det$ on singular matrices

  • if $M,M'$ are similar, $f(M)=f(M')$

So it suffices to show that $f(M)=\det(M)$ for all $M\in GL_n(K)$.

Facts.

  • Any invertible matrix $M$ is a product of transvection matrices (matrices corresponding to the row operations $L_i\leftarrow L_i+cL_j$ or the column operations $C_i\leftarrow C_i+cC_j$, with $i\neq j$) and a diagonal matrix

  • Any transvection matrix is conjugate to $\begin{pmatrix} 1 & & & \cr & \ddots & & \cr & & 1 & 1 \cr & & & 1\end{pmatrix}$
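
To illustrate the second fact, here is a sympy check with hand-picked $n=3$ and $c=7$: the transvection $I + cE_{12}$ is conjugated to the standard one by a permutation followed by a diagonal rescaling (one possible choice of conjugating matrix among many):

```python
import sympy as sp

n, c = 3, 7                  # hand-picked size and constant
T = sp.eye(n); T[0, 1] = c   # the transvection I + c*E_{12}

# Permutation matrix sending e1 -> e2, e2 -> e3, e3 -> e1, so the nonzero
# off-diagonal entry moves to position (n-1, n).
P = sp.Matrix([[0, 0, 1],
               [1, 0, 0],
               [0, 1, 0]])
D = sp.diag(1, 1, c)         # diagonal rescaling absorbing the constant c

Std = sp.eye(n); Std[n - 2, n - 1] = 1   # the standard transvection
M = D * P
assert M * T * M.inv() == Std
```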

So it is enough to prove that $f$ sends the standard transvection above to $1$. In view of the first condition on $f$ (together with the similarity fact), it is enough to write this matrix as a product of conjugates of diagonal matrices whose determinants multiply to $1$.

Since $K$ has at least three elements, one may choose $\lambda\in K\setminus\{0,1\}$. Then $\begin{pmatrix} 1 & & & \cr & \ddots & & \cr & & 1 & 1 \cr & & & 1\end{pmatrix}= \begin{pmatrix} 1 & & & \cr & \ddots & & \cr & & 1 & \cr & & &\lambda^{-1}\end{pmatrix}\begin{pmatrix} 1 & & & \cr & \ddots & & \cr & & 1 & 1 \cr & & & \lambda\end{pmatrix}$.

Now set $P=\begin{pmatrix}1 & 1 \cr 0 & \lambda-1\end{pmatrix}$, which is invertible since $\lambda\neq 1$. Then $P\begin{pmatrix} 1 & 0 \cr 0 & \lambda\end{pmatrix}P^{-1} = \begin{pmatrix} 1 & 1 \cr 0 & \lambda\end{pmatrix}$. Taking $Q=\begin{pmatrix} I_{n-2} & \cr & P\end{pmatrix}$, we get $Q\,\mathrm{diag}(1,\ldots,1,\lambda)\,Q^{-1} = \begin{pmatrix} 1 & & & \cr & \ddots & & \cr & & 1 & 1 \cr & & & \lambda\end{pmatrix}$, so the second factor is indeed conjugate to a diagonal matrix, and we are done.
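
Both displayed identities can be checked symbolically on the bottom-right $2\times 2$ blocks, with $\lambda$ a symbol (assumed nonzero and $\neq 1$, so that $P$ is invertible):

```python
import sympy as sp

lam = sp.symbols('lambda', nonzero=True)  # also assumed != 1 below

T  = sp.Matrix([[1, 1], [0, 1]])   # bottom-right block of the transvection
D1 = sp.diag(1, 1 / lam)
D2 = sp.Matrix([[1, 1], [0, lam]])

# The factorization from the answer: T = D1 * D2.
assert sp.simplify(D1 * D2 - T) == sp.zeros(2)

# D2 is conjugate to diag(1, lambda) via P (invertible since lam != 1).
P = sp.Matrix([[1, 1], [0, lam - 1]])
assert sp.simplify(P * sp.diag(1, lam) * P.inv() - D2) == sp.zeros(2)
```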

Edit. Here is another proof which includes the case of $\mathbb{F}_2$, apart from maybe one exception.

Thm. We have $f=\det$ in every case, except maybe when $n=2$ and $K=\mathbb{F}_2$.

Again, it is enough to show that $f$ and $\det$ coincide for invertible matrices.

Now an invertible matrix is the product of a diagonal matrix and a matrix of $SL_n(K)$. So it is enough to prove that $f$ sends a matrix of determinant $1$ to $1$.

But, under these assumptions, the subgroup of $GL_n(K)$ generated by commutators $[A,B]=ABA^{-1}B^{-1}$ is precisely $SL_n(K)$ (a classical result). Now, the first condition on $f$ ensures that it is trivial on commutators, since $f([A,B]) = f(A)f(B)f(A)^{-1}f(B)^{-1} = 1$ in the commutative group $K^\times$, and we are done.
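
The classical result can be confirmed by brute force in the smallest case not excluded by the theorem, $GL_2(\mathbb{F}_3)$; the snippet below enumerates the whole group, so it only works for tiny examples:

```python
from itertools import product

p = 3  # F_3: the smallest case not excluded by the theorem

def mul(A, B):
    # 2x2 matrix product with entries reduced mod p.
    return tuple(tuple(sum(A[i][k] * B[k][j] for k in range(2)) % p
                       for j in range(2)) for i in range(2))

def det(A):
    return (A[0][0] * A[1][1] - A[0][1] * A[1][0]) % p

def inv(A):
    d = pow(det(A), -1, p)  # modular inverse of the determinant (Python 3.8+)
    return ((A[1][1] * d % p, -A[0][1] * d % p),
            (-A[1][0] * d % p, A[0][0] * d % p))

GL = [((a, b), (c, d)) for a, b, c, d in product(range(p), repeat=4)
      if det(((a, b), (c, d))) != 0]
SL = {A for A in GL if det(A) == 1}

# Close the set of all commutators [A,B] = A B A^{-1} B^{-1} under products.
gens = {mul(mul(A, B), mul(inv(A), inv(B))) for A in GL for B in GL}
H, frontier = set(gens), set(gens)
while frontier:
    new = {mul(x, g) for x in frontier for g in gens} - H
    H |= new
    frontier = new

assert H == SL  # the commutator subgroup of GL_2(F_3) is exactly SL_2(F_3)
```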

Final remark. I have not yet thought about the exceptional case, but it shouldn't be difficult to determine all the possibilities for $f$ on $GL_2(\mathbb{F}_2)$. In this case, $GL_2(\mathbb{F}_2)\simeq D_3$, and we have an explicit presentation by generators and relations: two generators $r,s$, and three relations $r^3=1$, $s^2=1$, $srs^{-1}=r^{-1}$.

Explicitly, one may take $r=\begin{pmatrix}0 & 1 \cr 1 & 1\end{pmatrix}$ and $s=\begin{pmatrix}0 & 1\cr 1 & 0\end{pmatrix}$, if I'm not mistaken.
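
(The claimed relations are easy to verify with arithmetic mod $2$, e.g. using numpy:)

```python
import numpy as np

r = np.array([[0, 1], [1, 1]])   # claimed generator of order 3
s = np.array([[0, 1], [1, 0]])   # claimed generator of order 2
I = np.eye(2, dtype=int)

def m(*Ms):
    """Multiply matrices with entries reduced mod 2."""
    out = I
    for M in Ms:
        out = out @ M % 2
    return out

assert np.array_equal(m(r, r, r), I)        # r^3 = 1
assert np.array_equal(m(s, s), I)           # s^2 = 1
assert np.array_equal(m(s, r, s), m(r, r))  # s r s^{-1} = r^{-1}  (s = s^{-1})
```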

GreginGre
  • By transvection, do you mean a matrix obtained by adding a non-zero multiple of one row to another? (By elementary matrix, I was referring to one of the following three kinds of matrices: (1) a matrix obtained by multiplying a row of the identity matrix by a non-zero constant $c$, (2) a matrix obtained by swapping two rows of the identity matrix, (3) a matrix obtained by adding a non-zero multiple of one row of the identity matrix to another row.) – Avyaktha Achar Mar 08 '24 at 11:24
  • Also, could you give some geometric interpretation (along similar lines of what was done in the video from which I picked this question), at least for real matrices, explaining geometrically why these 2 properties characterise the determinant? The fact that just these 2 properties imply multilinearity is startling! – Avyaktha Achar Mar 08 '24 at 11:29
  • 1
  • For your first question: yes, that's what I meant. For your second question, no idea. Also, I found a different proof which works in every case except maybe one, so I added it to my post. – GreginGre Mar 08 '24 at 11:34
  • Ok thank you so much! I've edited the question to include the proof for a field with 2 elements. This, combined with your first proof, finishes the proof for all fields. – Avyaktha Achar Mar 08 '24 at 11:41
  • 1
  • I don’t understand your reservations when the field is $\mathbb F_2$. In this case, for every invertible matrix $T$, since $f(T)f(T^{-1})=f(TT^{-1})=f(I)=1$, $f(T)$ has to be nonzero. However, as the field has only two elements $0$ and $1$, $f(T)$ must be equal to $1$. Whether $T$ is a transvection or not is completely irrelevant. – user1551 Mar 08 '24 at 13:18