1

Prove that $\{ A \in \mathbb{R}^{n \times n} \mid A \text{ is symmetric}\}^{\bot} = \{ A \in \mathbb{R}^{n \times n} \mid A \ \text{is skew-symmetric}\}$ with $\langle A, B \rangle = \operatorname{Tr}(A^\top B)$.

(Note: I am aware of the other 3 questions posted about this problem on this forum, this one too.)

Proof.

Let $\{A\in\mathbf{R}^{n\times n}\mid A\text{ symmetric}\}=W$ and $\{A\in\mathbf{R}^{n\times n}\mid A\text{ antisymmetric}\}=V$.

We first prove the inclusion $\supseteq$. Let $A\in V$; we must show that $A\in W^\perp$, or equivalently that $A\perp B$ for all symmetric $B\in\mathbf{R}^{n\times n}$. We have $$\langle A,B\rangle=\operatorname{Tr}(A^\top B)=\operatorname{Tr}(-AB)=-\operatorname{Tr}(AB^\top)=-\operatorname{Tr}(B^\top A)=-\langle B,A\rangle=-\langle A,B\rangle,$$ so $\langle A,B\rangle=0$, and therefore $A\in W^\perp$.

Now we prove the inclusion $\subseteq$. Let $A\in W^\perp$, then we have to show that $A\in V$.

Consider the inner product $\langle A,A+A^\top \rangle$. Since $(A+A^\top)^\top=A+A^\top$, we have that $A+A^\top$ is symmetric for all $A\in \mathbf{R}^{n\times n}$ and because $A$ is orthogonal to all symmetric matrices, this inner product is equal to $0$. But we also have $$0=\langle A,A+A^\top\rangle=\langle A,A\rangle+\langle A,A^\top\rangle \iff \langle A,A\rangle=\langle A,-A^\top\rangle$$ Here comes the problem: I want to deduce from this step that $A=-A^\top$, but I realise that this is not allowed.

How should I prove the second inclusion? I know that otherwise I can use something that uses the unique decomposition of $A\in\mathbf{R}^{n\times n}$ as sum of a symmetric and an antisymmetric matrix $A=\frac{A+A^\top}{2}+\frac{A-A^\top}{2}$, but I don't see how this would help.
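(As a quick numerical sanity check, not a proof: the decomposition above and the orthogonality of its two parts can be verified in Python for a random matrix. The helper names below are made up for this check.)

```python
import random

def transpose(A):
    # Transpose a square matrix given as a list of rows
    return [list(row) for row in zip(*A)]

def trace_prod(A, B):
    # <A, B> = Tr(A^T B) = sum over all entries of A[i][j] * B[i][j]
    n = len(A)
    return sum(A[i][j] * B[i][j] for i in range(n) for j in range(n))

n = 4
A = [[random.random() for _ in range(n)] for _ in range(n)]
At = transpose(A)

# Symmetric and antisymmetric parts: A = (A + A^T)/2 + (A - A^T)/2
S = [[(A[i][j] + At[i][j]) / 2 for j in range(n)] for i in range(n)]
K = [[(A[i][j] - At[i][j]) / 2 for j in range(n)] for i in range(n)]

# S + K recovers A, and <S, K> = 0
assert all(abs(S[i][j] + K[i][j] - A[i][j]) < 1e-12 for i in range(n) for j in range(n))
assert abs(trace_prod(S, K)) < 1e-12
print("decomposition and orthogonality check passed")
```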

rae306
  • 9,850

3 Answers

2

You wish to show that for $A\in W^\bot$ you have $A+A^T = 0$. We have $$\begin{aligned} \langle A + A^T, A + A^T\rangle &=\langle A, A+A^T\rangle +\langle A^T, A+A^T\rangle\\ &=0+\langle A^T, A+A^T\rangle \end{aligned}$$ since $A + A^T$ is symmetric and $A\in W^\bot$. If you expand the second term you obtain $$\langle A^T, A+A^T\rangle = \text{Tr}\left(A(A+A^T)\right) = \text{Tr}\left((A+A^T) A\right) =\langle A+A^T, A\rangle = 0 $$ as well, where we used that $\text{Tr}(BC) = \text{Tr}(CB)$ for all matrices $B$ and $C$.

This shows that $\langle A + A^T, A + A^T\rangle=0$ and since $\langle\cdot,\cdot\rangle$ is an inner product this implies that $A + A^T=0$, which gives $A^T = -A$ as desired.
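(A throwaway numerical sketch, not part of the argument: the identity $\operatorname{Tr}(BC)=\operatorname{Tr}(CB)$ used above can be sanity-checked for random matrices. The helpers are ad hoc.)

```python
import random

def matmul(A, B):
    # Product of two square matrices given as lists of rows
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

n = 3
B = [[random.random() for _ in range(n)] for _ in range(n)]
C = [[random.random() for _ in range(n)] for _ in range(n)]

# Tr(BC) == Tr(CB): the cyclic property of the trace
assert abs(trace(matmul(B, C)) - trace(matmul(C, B))) < 1e-9
print("trace cyclicity verified")
```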

Jolien
  • 1,695
1

If you already know that $\langle\,\cdot\,,\,\cdot\,\rangle$ is an inner product (in particular, that it is positive definite), then you're essentially done after your first display equation: that equation implies that $V$ and $W$ are orthogonal, hence transverse to one another, and since $\dim V + \dim W = n^2 = \dim \mathbf{R}^{n\times n}$, they are in fact orthogonal complements.

Travis Willse
  • 108,056
  • What do you mean by transverse? Could you formulate this into a theorem? The only theorem in my textbook says that $\dim U+\dim U^\perp=\dim V$ for $U$ a subspace of $V$, but I need to prove that one is the orthogonal complement of the other. – rae306 Nov 24 '17 at 15:26
  • Vector subspaces $V, W \subseteq X$ are transverse iff they have trivial intersection, that is, $V \cap W = \{0\}$. They are complementary iff (1) they are transverse and (2) they span the whole space (so $V + W = X$). Since $\dim (V + W) = \dim V + \dim W - \dim(V \cap W)$, transversality and the property $\dim V + \dim W = \dim X$ imply that $V + W$ is all of $X$, and thus that $V$ and $W$ are complementary. – Travis Willse Nov 24 '17 at 17:32
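(For concreteness, the dimension count here is just counting free entries: a symmetric matrix is determined by its entries on and above the diagonal, $n(n+1)/2$ of them, and a skew-symmetric one by its entries strictly above the diagonal, $n(n-1)/2$ of them. A tiny arithmetic check:)

```python
# dim(symmetric) + dim(skew-symmetric) = n^2 for every n
for n in range(1, 6):
    dim_sym = n * (n + 1) // 2   # upper triangle including diagonal
    dim_skew = n * (n - 1) // 2  # strict upper triangle
    assert dim_sym + dim_skew == n * n
print("dimension count verified for n = 1..5")
```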
0

If $\operatorname{Tr}(XA)=0$ for all symmetric matrices $X$, then one can take $X$ to have all entries zero apart from $1$s in positions $(i,j)$ and $(j,i)$. For $i\neq j$ this shows that $a_{ij}+a_{ji}=0$, and for $i=j$ (where $X$ has a single $1$, in position $(i,i)$) it shows that $a_{ii}=0$, so $A$ is skew-symmetric.
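(Concretely, a small sketch with a made-up helper: with such an $X$, the trace $\operatorname{Tr}(XA)$ picks out exactly $a_{ij}+a_{ji}$, or $a_{ii}$ on the diagonal.)

```python
import random

n = 3
A = [[random.random() for _ in range(n)] for _ in range(n)]

def trace_XA(i, j, A):
    # X has 1s at (i, j) and (j, i), zeros elsewhere (a single 1 when i == j);
    # returns Tr(XA) = sum_r sum_c X[r][c] * A[c][r]
    n = len(A)
    X = [[0.0] * n for _ in range(n)]
    X[i][j] = 1.0
    X[j][i] = 1.0
    return sum(X[r][c] * A[c][r] for r in range(n) for c in range(n))

# Off-diagonal choice picks out a_{ij} + a_{ji}; diagonal choice picks out a_{ii}
assert abs(trace_XA(0, 1, A) - (A[0][1] + A[1][0])) < 1e-12
assert abs(trace_XA(0, 0, A) - A[0][0]) < 1e-12
print("entrywise trace check passed")
```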