28

So I'm studying linear algebra and one of the self-study exercises has a set of true or false questions. One of the questions is this:

If $A^2 = I$ (Identity Matrix), then $A = \pm I$ ?

I'm pretty sure it's true, but the answer says it's false. How can this be false (maybe it's a typographical error in the book)?

5 Answers

43

A simple counterexample is $$A = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix} $$ We have $A \neq \pm I$, but $A^{2} = I$.
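A quick numerical sanity check (a sketch using NumPy) confirms the counterexample:

```python
import numpy as np

# The counterexample: a reflection across the x-axis
A = np.array([[1, 0],
              [0, -1]])

I = np.eye(2, dtype=int)
# A is neither +I nor -I, yet A squared is the identity
assert not np.array_equal(A, I) and not np.array_equal(A, -I)
assert np.array_equal(A @ A, I)
```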

20

In dimension $\geq 2$, take the matrix that exchanges two basis vectors (a "transposition").

Blah
  • 5,484
  • If you want to exchange the (standard) basis vectors $e_{i}$ and $e_{j}$ ($1 \leq i,j \leq n$, $i \neq j$), then use the matrix $A = [m_{kl}]$ with $m_{kk} = 1$ for $k\neq i,j$, $m_{ij} = m_{ji} = 1$, and $m_{kl} = 0$ for all other values of $k$ and $l$. For example, if you want $e_2$ and $e_3$ exchanged in $\mathbb{R}^{3}$, take $$A = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix}$$ It is clear that such a matrix always satisfies $A^2 = I$, since applying it twice always gets you back to where you started. – Martin Wanvik Feb 05 '12 at 21:01
  • Thank you @Martin Wanvik, pretty clear explanation. – Randolf Rincón Fadul Feb 05 '12 at 21:52
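Martin Wanvik's recipe is easy to automate. Here is a small sketch (the helper name `swap_matrix` is my own) that builds the matrix exchanging $e_i$ and $e_j$ and checks that it squares to the identity:

```python
import numpy as np

def swap_matrix(n, i, j):
    """Identity matrix of size n with rows i and j exchanged (0-based indices)."""
    A = np.eye(n, dtype=int)
    A[[i, j]] = A[[j, i]]  # swap the two rows
    return A

A = swap_matrix(3, 1, 2)  # exchanges e_2 and e_3 in R^3
assert np.array_equal(A @ A, np.eye(3))
```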
15

I know $2·\mathbb C^2$ many counterexamples, namely

$$A=c_1\begin{pmatrix} 0&1\\ 1&0 \end{pmatrix}+c_2\begin{pmatrix} 1&0\\ 0&-1 \end{pmatrix}\pm\sqrt{c_1^2+c_2^2\pm1}\begin{pmatrix} 0&-1\\ 1&0 \end{pmatrix},$$

see Pauli Matrices $\sigma_i$.

These are all such matrices and can be written as $A=\vec e· \vec \sigma$, where $\vec e^2=\pm1$.
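Taking the inner sign to be $-1$ (the branch that gives $A^2 = +I$), the family can be spot-checked numerically; the parameter values below are arbitrary:

```python
import numpy as np

c1, c2 = 1.3, 0.7
d = np.sqrt(c1**2 + c2**2 - 1)  # inner sign -1 for A^2 = +I

sigma_x = np.array([[0, 1], [1, 0]])
sigma_z = np.array([[1, 0], [0, -1]])
J = np.array([[0, -1], [1, 0]])  # equals i*sigma_y; satisfies J @ J = -I

# Cross terms cancel, so A @ A = (c1^2 + c2^2 - d^2) * I = I
A = c1 * sigma_x + c2 * sigma_z + d * J
assert np.allclose(A @ A, np.eye(2))
```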

Nikolaj-K
  • 12,559
7

The following matrix is a counterexample: $ A = \left( {\begin{array}{cc} -1 & 0 \\ 0 & 1 \\ \end{array} } \right) $

azarel
  • 13,506
7

"Most" (read: diagonalizable) matrices can be viewed simply as a list of numbers -- its eigenvalues -- in the right basis. When doing arithmetic with just this matrix (or with other matrices that diagonalize in the same basis), you just do arithmetic on the eigenvalues.

So, to find diagonalizable solutions to $A^2 = I$, we just need to write down a matrix whose eigenvalues satisfy $\lambda^2 = 1$ -- and any such matrix will do.

When you think about matrices in this way -- as a list of independent numbers -- it becomes easy to think your way through problems like this.
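The viewpoint above can be sketched directly: pick eigenvalues from $\{+1, -1\}$, put them on a diagonal, and conjugate by an invertible matrix (the matrix $P$ below is an arbitrary choice):

```python
import numpy as np

D = np.diag([1, -1, 1])        # eigenvalues satisfying lambda^2 = 1
P = np.array([[2, 1, 0],
              [1, 1, 0],
              [0, 3, 1]])      # any invertible matrix works here
A = P @ D @ np.linalg.inv(P)   # A has the same eigenvalues as D

assert np.allclose(A @ A, np.eye(3))
```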

  • 1
    Every matrix satisfying $A^2=I$ is diagonalizable, because either it is $\pm I$ or its minimal polynomial is $(x-1)(x+1)$. The general solution is obtained by taking all diagonal matrices with entries $\pm 1$ on the diagonal and conjugating by invertible matrices. – Jonas Meyer Feb 06 '12 at 05:03
  • 2
    Jonas Meyer, this is only true if $\operatorname{char} F \ne 2$. Otherwise, there are such matrices which are not diagonalizable. – the L Feb 06 '12 at 08:18
  • 1
    @Jonas: That's a good point to mention as an appendix, but dealing properly with non-diagonalizable matrices in this fashion is somewhat more sophisticated. The only reason I mentioned the word was so that I didn't mislead Randolf into thinking this method works (unmodified) for all matrices; e.g. that the argument I gave isn't sufficient to tell us that this (or any!) equation has only diagonalizable solutions. –  Feb 06 '12 at 09:53
  • 1
    @anonymous: Good point, e.g. $\begin{bmatrix}1&1\\ 0&1\end{bmatrix}$. @Hurkyl: I agree, it is best as an appendix. I appreciate your caution, but wanted to point out that your method does lead to the general solution (in the characteristic $0$ case that the OP is probably working in). – Jonas Meyer Feb 06 '12 at 15:49
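The characteristic-2 caveat is easy to check with integer arithmetic mod 2: over $\mathbb{F}_2$ the non-diagonalizable matrix $\begin{bmatrix}1&1\\ 0&1\end{bmatrix}$ also squares to the identity.

```python
import numpy as np

A = np.array([[1, 1],
              [0, 1]])
# Over F_2 (arithmetic mod 2): A @ A = [[1, 2], [0, 1]] = I, yet A != +-I
assert np.array_equal((A @ A) % 2, np.eye(2, dtype=int))
```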