
Let $A$ be an $n \times n$ real matrix.

Prove that

$$A=-A^T \iff AA^T=-A^2.$$

Thanks.

Julien
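
(A quick numerical sanity check, not part of the original question: a minimal NumPy sketch that tests the stated identity on a randomly generated antisymmetric matrix and on a generic one. The matrix size and random seed are arbitrary illustration choices.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Easy direction: if A = -A^T (antisymmetric), then A A^T = A(-A) = -A^2.
M = rng.standard_normal((4, 4))
A = M - M.T                                # antisymmetric by construction
print(np.allclose(A @ A.T, -(A @ A)))      # expected: True

# A generic matrix is not antisymmetric, and the identity should fail.
B = rng.standard_normal((4, 4))
print(np.allclose(B @ B.T, -(B @ B)))      # expected: False (almost surely)
```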

1 Answer


Here's a proof of the nontrivial direction, right to left in the question. (The other direction is immediate: if $A=-A^T$, then $AA^T=A(-A)=-A^2$.)

Like any real square matrix, $A$ can be written as the sum $A=S+G$ of a symmetric matrix $S=(A+A^T)/2$ and an antisymmetric matrix $G=(A-A^T)/2$. Since $A+A^T=2S$, the given equation $AA^T=-A^2$, i.e. $A(A+A^T)=0$, can be rewritten as $(S+G)S=0$, or $S^2+GS=0$. Our task is to prove that $S=0$, which is exactly the statement $A=-A^T$.

Suppose not. By the spectral theorem, $S$ has a non-zero (real) eigenvalue $\lambda$; let $v$ be a corresponding column eigenvector, so $v\neq0$ and $Sv=\lambda v$. Then $S^2v=\lambda^2v$ and, since $S^2+GS=0$, we get
$$ \lambda Gv=G(\lambda v)=GSv=-S^2v=-\lambda^2v. $$
As $\lambda\neq0$, we infer $Gv=-\lambda v$. Using this, we compute
$$ \lambda v^Tv=-v^TGv=+v^TG^Tv=(Gv)^Tv=-\lambda v^Tv, $$
where the second equality uses that $G$ is antisymmetric. So $2\lambda v^Tv=0$. But this is absurd, since $\lambda$ and $v^Tv$ (and $2$) are non-zero.

Andreas Blass
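
To make the rewriting step in the answer concrete, here is a small numerical sketch (my addition, not part of the answer) checking the decomposition $A=S+G$ and the identity $AA^T+A^2=2(S^2+GS)$, which is why the hypothesis $AA^T=-A^2$ is equivalent to $S^2+GS=0$. The helper `sym_antisym_parts`, the seed, and the matrix size are illustrative choices.

```python
import numpy as np

def sym_antisym_parts(A):
    """Return the symmetric part S and antisymmetric part G with A = S + G."""
    S = (A + A.T) / 2
    G = (A - A.T) / 2
    return S, G

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
S, G = sym_antisym_parts(A)

# The decomposition used in the proof: A = S + G with S symmetric, G antisymmetric.
print(np.allclose(A, S + G))                        # expected: True
print(np.allclose(S, S.T), np.allclose(G, -G.T))    # expected: True True

# Rewriting step: A A^T + A^2 = A(A^T + A) = (S + G)(2S) = 2(S^2 + GS),
# so A A^T = -A^2 holds exactly when S^2 + GS = 0.
print(np.allclose(A @ A.T + A @ A, 2 * (S @ S + G @ S)))   # expected: True
```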