
Suppose that $A$ is a $3\times 3$ real orthogonal matrix and the characteristic polynomial of $A$ is $(x+1)(x-1)^2$. Prove that $A$ is symmetric.


I know that $A$ is a real normal matrix with real eigenvalues and hence symmetric; see the question "A normal matrix with real eigenvalues is Hermitian".

But I think that's overkill, since it does not even use the specific characteristic polynomial. So is there an elementary method to tackle this question? Thank you.

Bach
  • "Eigenvalues are $1,-1,-1$" - does it mean that you know that there are three linearly independent eigenvectors? – A.Γ. Aug 02 '19 at 06:47
  • @A.Γ. Of course not. The geometric multiplicity is less than or equal to the algebraic multiplicity. – Bach Aug 02 '19 at 07:54

3 Answers


Note that $A^2$ is real and orthogonal too, with $1$ its only eigenvalue. The only such matrix is $I$. Therefore, $$A^T = A^T I = A^T AA = IA = A.$$

Theo Bendit
  • So the $3\times3$ hypothesis is superfluous. – Gerry Myerson Aug 02 '19 at 03:29
  • Why is the only such matrix $I$? – Bach Aug 02 '19 at 03:31
  • @Bach Honestly, I was hoping you wouldn't ask this, as this gets into what you're prepared to assume. You wanted something more elementary than spectral theorem, but I'm not sure exactly who you're pitching this proof to. Orthogonal matrices are normal, and hence diagonalisable, which should be sufficient. Showing $A$ is diagonalisable would also work similarly. You could also argue from $A$ representing an isometry, and using some elementary Euclidean geometry; it's long, but I think it's doable. I think if you want more than this, you might have to wait for someone else to answer. :-) – Theo Bendit Aug 02 '19 at 03:34
  • @TheoBendit I get it. Thank you! – Bach Aug 02 '19 at 03:40
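As a numerical sanity check (not a proof), the identities in this answer can be traced with numpy. The matrix $A$ below is an assumed test instance built as $Q\,\mathrm{diag}(1,1,-1)\,Q^T$ from a random orthogonal $Q$; note this construction is symmetric by design, so the code only illustrates the chain $A^2=I \Rightarrow A^T=A$, it does not replace the argument:

```python
import numpy as np

rng = np.random.default_rng(0)  # seed chosen arbitrarily for reproducibility

# Build a random 3x3 real orthogonal Q via QR, then a test matrix
# A = Q diag(1, 1, -1) Q^T, which is orthogonal with characteristic
# polynomial (x+1)(x-1)^2.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ np.diag([1.0, 1.0, -1.0]) @ Q.T

# A is orthogonal, and A^2 is orthogonal with 1 as its only eigenvalue,
# hence A^2 = I ...
assert np.allclose(A.T @ A, np.eye(3))
assert np.allclose(A @ A, np.eye(3))

# ... and therefore A^T = A^T (A A) = (A^T A) A = A.
assert np.allclose(A.T, A)
```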

Here is a proof without using any spectral property of orthogonal or normal matrices, but it only applies because $A$ is $3\times3$.

By assumption, $1$ and $-1$ are eigenvalues of $A$. Therefore there exist real unit vectors $x$ and $y$ such that $Ax=x$ and $Ay=-y$. Then $y^Tx=y^TIx=y^TA^TAx=-y^Tx$. Hence $y^Tx=0$, i.e. $x\perp y$. Complete $\pmatrix{x&y}$ to a real orthogonal matrix $Q=\pmatrix{x&y&z}$. Since $A$ and $Q$ are real orthogonal, so is $AQ=\pmatrix{x&-y&Az}$. Hence $Az$ is orthogonal to both $x$ and $y$ and it must be a scalar multiple of $z$. In other words, $z$ is also an eigenvector of $A$. Thus $A$ has an orthonormal eigenbasis over $\mathbb R$, meaning that $A$ is real symmetric.

user1551
  • If $Ax=\lambda x$ and $y\in x^\bot$ then $Ay\in x^\bot$ too as $$\langle x,Ay\rangle=\lambda^{-1}\langle Ax,Ay\rangle=\lambda^{-1}\langle x,y\rangle=0.$$ Thus $x^\bot$ is an invariant subspace for $A$, and one can do induction on the dimension of $x^\bot$ for $n\times n$ orthogonal matrices as well. – A.Γ. Aug 02 '19 at 19:03
  • @Bach A.Γ. is correct. What he/she means is not that every $y\in x^\perp$ is an eigenvector of $A$, but that the restriction of $A$ to $x^\perp$ is orthogonal, so you may prove the statement recursively. This is a very fine argument that is (better than mine and) applicable to a general $n\times n$ orthogonal matrix with a real spectrum. I deliberately avoided that kind of argument because I thought you wanted a quick and very elementary proof. – user1551 Aug 03 '19 at 09:24
  • @user1551 Oops, thank you for the clarification! – Bach Aug 03 '19 at 10:21
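The construction in this answer can also be walked through numerically. A hedged numpy sketch follows; the test matrix $A$ (built from a random orthogonal $Q$, an assumption for illustration only) has the required characteristic polynomial, and the code mirrors the proof's steps: extract $x$ and $y$, verify $x\perp y$, and check that completing to $z$ yields another eigenvector:

```python
import numpy as np

rng = np.random.default_rng(1)  # arbitrary seed for reproducibility

# Illustrative test instance with characteristic polynomial (x+1)(x-1)^2.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ np.diag([1.0, 1.0, -1.0]) @ Q.T

# Unit eigenvectors with Ax = x and Ay = -y.
vals, vecs = np.linalg.eig(A)
x = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
y = np.real(vecs[:, np.argmin(np.abs(vals + 1))])

# y^T x = y^T A^T A x = -y^T x forces x ⊥ y.
assert abs(y @ x) < 1e-10

# Complete (x y) to an orthogonal basis with z = x × y; the proof shows
# Az is orthogonal to x and y, hence a scalar multiple (±1) of z.
z = np.cross(x, y)
Az = A @ z
assert abs(Az @ x) < 1e-10 and abs(Az @ y) < 1e-10
assert np.allclose(Az, z) or np.allclose(Az, -z)
```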

Since $A$ is orthogonal (hence normal) with real eigenvalues, it must be diagonalizable, which implies that its minimal polynomial is $m(t)=(t-1)(t+1)=t^2-1$.

So $A^2-I=0\implies A^2=I$, i.e. $AA=I$. $\quad(1)$

By orthogonality of $A$, we already have $AA'=I$, where $A'$ denotes the transpose of $A$. $\quad(2)$

From (1) and (2), $AA=AA'$, and since $A$ is invertible, $A=A'$ follows.

Nitin Uniyal