
I am trying to solve the problem here: Let $A,B,C,D$ be pairwise commuting $n\times n$ matrices over a field $F$ (it is not given whether any of these matrices are invertible). Show that the determinant of the $2n\times 2n$ matrix $$\begin{bmatrix} A&B\\C&D \end{bmatrix}$$ is $\det(AD-BC)$.

I know that there is an answer given in a paper, which mentions working over $F[x]$. However, I have come up with the following idea:

Motivated by the adjoint formula for the inverse, we get that $$\begin{bmatrix} A&B\\ C&D \end{bmatrix} \begin{bmatrix} D&-B\\ -C&A \end{bmatrix}= \begin{bmatrix} AD-BC & BA-AB\\ CD-DC&AD-BC \end{bmatrix} = \begin{bmatrix} AD-BC & 0\\ 0&AD-BC \end{bmatrix} $$ Hence $$\det \left(\begin{bmatrix} A&B\\ C&D \end{bmatrix}\right)\det\left(\begin{bmatrix} D&-B\\ -C&A \end{bmatrix}\right)= \det(AD-BC)^2$$ But then $$\det\left( \begin{bmatrix} A&B\\ C&D \end{bmatrix}\right)= -\det\left(\begin{bmatrix} B&A\\ D&C \end{bmatrix}\right)= -\det\left(\begin{bmatrix} B&D\\ A&C \end{bmatrix}\right)= \det\left(\begin{bmatrix} D&B\\ C&A \end{bmatrix}\right) $$ Now, thinking of $\det$ in terms of permutations, suppose for a given permutation in the $\det$ sum that we pick $m$ elements in the first $n$ rows from the $-B$ side. Then we must also pick $m$ elements from the last $n$ rows from the $-C$ side; i.e. the sign changes must cancel out. Thus, $$\det\left(\begin{bmatrix} D&B\\ C&A \end{bmatrix}\right)=\det\left(\begin{bmatrix} D&-B\\ -C&A \end{bmatrix}\right)$$ which shows that $$\det \left(\begin{bmatrix} A&B\\ C&D \end{bmatrix}\right)=\pm \det(AD-BC)$$
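(As a quick sanity check, not part of any proof: the block identity above does hold numerically. Here is a minimal Python/NumPy sketch; generating the commuting matrices as polynomials in one random matrix is just a convenient device for producing test data.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))

# Polynomials in a single matrix M commute pairwise.
A = M @ M + 2 * M + np.eye(n)
B = 3 * M - np.eye(n)
C = M @ M @ M
D = M + 5 * np.eye(n)

block = np.block([[A, B], [C, D]])
adjugate_like = np.block([[D, -B], [-C, A]])

# The product should be block diagonal with AD - BC in both diagonal blocks.
target = A @ D - B @ C
zeros = np.zeros((n, n))
print(np.allclose(block @ adjugate_like,
                  np.block([[target, zeros], [zeros, target]])))  # True
```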

Is there a way to complete the proof from here? All I need to show is that $$\det \left(\begin{bmatrix} A&B\\ C&D \end{bmatrix}\right)\quad\text{ and }\quad \det(AD-BC)$$ have the same sign.

Vasting

1 Answer


$\textbf{Remarks.}$

i) After "But then", the correct statement is $$\det\left( \begin{bmatrix} A&B\\ C&D \end{bmatrix}\right)= (-1)^n\det\left(\begin{bmatrix} B&A\\ D&C \end{bmatrix}\right)= \det\left(\begin{bmatrix} D&C\\ B&A \end{bmatrix}\right), $$ since swapping the two block columns is a product of $n$ column transpositions (a factor of $(-1)^n$, not $-1$), and swapping the two block rows contributes another factor of $(-1)^n$.

ii) Why do you write

$$ \det\left(\begin{bmatrix} B&A\\ D&C \end{bmatrix}\right)= \det\left(\begin{bmatrix} B&D\\ A&C \end{bmatrix}\right)\;? $$

This is not a transposition: the transpose of $\begin{bmatrix} B&A\\ D&C \end{bmatrix}$ is $\begin{bmatrix} B^T&D^T\\ A^T&C^T \end{bmatrix}$, not $\begin{bmatrix} B&D\\ A&C \end{bmatrix}$, so this equality is unjustified.

iii) Last line: what do you mean by "same sign" in a field $F$? An arbitrary field carries no ordering, so "sign" is not defined there.

EDIT. What follows is a simple proof using polynomials.

Let $\overline{F}$ be the algebraic closure of $F$. It is known that $\overline{F}$ is an infinite field.

We also know that the required formula is valid when $A,B,C,D\in M_n(\overline{F})$ and $A$ is invertible. If $A$ is not invertible, we replace it with $A+\lambda I_n$ (which still commutes with $B$, $C$ and $D$), where $\lambda\in \overline{F}$ is such that $P(\lambda)=\det(A+\lambda I_n)\not= 0$; note that, since the polynomial $P$ admits at most $n$ roots, the set $S$ of such $\lambda$'s is infinite.
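For completeness, here is a sketch of the invertible case (my own filling-in of this standard step; it uses only $CA=AC$ and $CB=BC$): $$\begin{bmatrix} A&B\\ C&D \end{bmatrix}= \begin{bmatrix} A&0\\ C&I_n \end{bmatrix} \begin{bmatrix} I_n&A^{-1}B\\ 0&D-CA^{-1}B \end{bmatrix},$$ so $$\det\left(\begin{bmatrix} A&B\\ C&D \end{bmatrix}\right)= \det(A)\det(D-CA^{-1}B)= \det(AD-ACA^{-1}B)= \det(AD-CB)= \det(AD-BC),$$ where $ACA^{-1}=C$ because $CA=AC$.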

Then, for every $\lambda\in S$, $$Q(\lambda)= \det\left(\begin{bmatrix} A+\lambda I_n&B\\ C&D \end{bmatrix}\right)-\det(AD-BC+\lambda D)=0.$$

The polynomial $Q$ admits infinitely many roots, so $Q$ is identically zero; in particular $Q(0)=0$, which is the required identity. $\square$
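As a numerical illustration (again not part of the proof): a Python/NumPy sketch checking the formula when $A$ is singular, the case the $\lambda$-perturbation handles; taking the matrices as polynomials in one fixed matrix is just one convenient way to make them commute.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n))
M[:, 0] = 0.0  # force M (hence A below) to be singular

# Polynomials in the same matrix M commute pairwise.
A = M @ M                 # singular, since M is
B = M + 2 * np.eye(n)
C = 3 * M - np.eye(n)
D = M @ M @ M + M

lhs = np.linalg.det(np.block([[A, B], [C, D]]))
rhs = np.linalg.det(A @ D - B @ C)
print(np.isclose(lhs, rhs))  # expect True (up to floating-point error)
```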