
I tried to prove that a real antisymmetric matrix can be taken by an orthogonal transformation to the form:

$$\begin{pmatrix} 0 & \lambda_1 & & & \\ -\lambda_1 & 0 & & & \\ & & 0 & \lambda_2 & \\ & & -\lambda_2 & 0 & \\ & & & & \ddots \end{pmatrix}$$

where the eigenvalues are $\pm i\lambda_1, \pm i\lambda_2, \ldots$

which is a statement I saw on Wikipedia: http://en.wikipedia.org/wiki/Antisymmetric_matrix
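
As a quick numerical sanity check of that statement (not a proof): for a real antisymmetric matrix, the real Schur decomposition already produces exactly this block form with a real orthogonal change of basis. Here is a small SciPy sketch; the size $6$ and the random matrix are just an arbitrary example of mine:

```python
import numpy as np
from scipy.linalg import schur

# Build a random real antisymmetric matrix A = -A^T (size 6 is arbitrary).
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M - M.T

# Real Schur decomposition: A = Z T Z^T with Z real orthogonal.
# Because A is real and normal, T should be block diagonal with
# 2x2 blocks of the form [[0, lam], [-lam, 0]] (up to rounding).
T, Z = schur(A, output='real')

print(np.allclose(Z @ Z.T, np.eye(6)))   # Z is orthogonal
print(np.allclose(Z.T @ A @ Z, T))       # T is A in the new basis
print(np.round(T, 8))                    # the expected antisymmetric 2x2 blocks
print(np.linalg.eigvals(A))              # eigenvalues come in pairs +/- i*lambda_k
```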

I also know an antisymmetric matrix can be diagonalized by a unitary transformation, and I found a unitary transformation taking the diagonal matrix to the required form.
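
For a single conjugate pair of eigenvalues, one concrete choice of such a unitary (the ordering $\operatorname{diag}(i\lambda, -i\lambda)$ and the letter $W$ are my notation) is

$$W = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & -i \\ 1 & i \end{pmatrix}, \qquad W^*\begin{pmatrix} i\lambda & 0 \\ 0 & -i\lambda \end{pmatrix} W = \begin{pmatrix} 0 & \lambda \\ -\lambda & 0 \end{pmatrix},$$

and a block-diagonal direct sum of such $W$'s handles the full diagonal matrix.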

So by composing the two transformations (diagonalization, then taking the diagonal matrix to the required form), I'll get a unitary transformation taking the real antisymmetric matrix to another real matrix.
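
Spelled out, writing $\Sigma$ for the real antisymmetric matrix, $V$ for a unitary diagonalizing it, and $W$ for the block-diagonal direct sum of the $2\times 2$ unitaries above (the letters $\Sigma$, $V$, $W$, $D$ are my shorthand): if $V^*\Sigma V = D = \operatorname{diag}(i\lambda_1, -i\lambda_1, i\lambda_2, -i\lambda_2, \ldots)$, then

$$(VW)^*\,\Sigma\,(VW) = W^*(V^*\Sigma V)\,W = W^*DW,$$

which is the real block form above, while $VW$ is still unitary.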

My question is whether this transformation must be a real matrix. If so, I can deduce that the unitary transformation is in fact an orthogonal transformation.

So is this true?

Is a unitary transformation taking a real matrix to another real matrix necessarily an orthogonal transformation?

EDIT: After receiving a counterexample in the comments here, I'm adding:

Alternatively, if it is not necessarily orthogonal, does there necessarily exist an orthogonal transformation taking one matrix to the other?

fiftyeight
  • It can be imaginary (imagine that both your real matrices are $I_n$), but there always exists an orthogonal matrix that does the job. – PseudoNeo Aug 02 '12 at 19:09
  • If $A$ is an orthogonal transformation which diagonalizes $\Sigma$ and $U$ is an arbitrary non-real unitary transformation which leaves $\Sigma$ unchanged (e.g. does an arbitrary transformation in its null space only), then obviously $AU$ is a non-orthogonal unitary transformation which diagonalizes $\Sigma$ (because $U$ doesn't change it, and then $A$ diagonalizes it). Therefore what I think you actually want to know is whether it must be an orthogonal matrix up to transformations which don't change $\Sigma$. – celtschk Aug 02 '12 at 19:17
  • @celtschk yes you are right, I added this to my question – fiftyeight Aug 02 '12 at 19:19
  • After the edit, the answer is yes: unitarily equivalent real matrices are orthogonally equivalent. But I struggle to find a reference. – PseudoNeo Aug 02 '12 at 19:23

2 Answers


Yes. Quoting Halmos's Linear Algebra Problem Book (Solution 160):

“If $A$ and $B$ are real, $U$ is unitary, and $U^*AU = B$, then there exists a real orthogonal $V$ such that $V^*AV = B$.

A surprisingly important tool in the proof is the observation that the unitary equivalence of $A$ and $B$ via $U$ implies the same result for $A^*$ and $B^*$. Indeed, the adjoint of the assumed equation is $U^*A^*U = B^*$.

Write $U$ in terms of its real and imaginary parts $U = E + i F$. It follows from $AU = UB$ that $AE = EB$ and $AF = FB$, and hence that $A(E+\lambda F) = (E+\lambda F)B$ for every scalar $\lambda$. If $\lambda$ is real and different from a finite number of troublesome scalars (the ones for which $\det(E+\lambda F) = 0$), the real matrix $S = E + \lambda F$ is invertible, and, of course, has the property that $AS=SB$.

Proceed in the same way from $U^*A^*U = B^*$: deduce that $A^*(E+\lambda F) = (E+\lambda F)B^*$ for all $\lambda$, and, in particular, for the ones for which $E+\lambda F$ is invertible, and infer that $A^*S = SB^*$ (and hence that $S^*A = BS^*$).

Let $S =VP$ be the polar decomposition of $S$ (that theorem works just as well in the real case as in the complex case, so that $V$ and $P$ are real.) Since $$BP^2 = BS^*S = S^*AS = S^*SB = P^2B,$$ so that $P^2$ commutes with $B$, it follows that $P$ commutes with $B$. Since $$AVP = AS = SB = VPB = VBP$$ and $P$ is invertible, it follows that $AV=VB$, and the proof is complete.”
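
A remark on the last step, since it is not spelled out in the quote and is asked about in the comments below: $P$ is positive semidefinite, so it can be written as a polynomial in $P^2$. Concretely, if $\mu_1,\dots,\mu_r$ are the distinct eigenvalues of $P^2$ and $p$ is any polynomial with $p(\mu_j)=\sqrt{\mu_j}$, then by the spectral theorem

$$P = p(P^2), \qquad\text{so}\qquad BP = B\,p(P^2) = p(P^2)\,B = PB$$

whenever $B$ commutes with $P^2$.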

Needless to say, that isn't the shortest path to prove the reduction of antisymmetric matrices...

PseudoNeo
  • I think you meant $A^*$ in the adjoint of the assumed equation. – celtschk Aug 02 '12 at 19:45
  • BTW, the first part of the proof shows that two real matrices which are similar over $\mathbb C$ are similar over $\mathbb R$. This result is also good to know. – PseudoNeo Aug 02 '12 at 19:47
  • Thank you. The only statement I'm not sure about is: $P^2$ commutes with $B$, therefore $P$ commutes with $B$. Is this a general statement or just true for this $P$? – fiftyeight Aug 02 '12 at 20:44

Here is an answer that works for real normal matrices, in particular for real antisymmetric (skew-symmetric) matrices.

Every normal matrix is unitarily diagonalizable, so there is an ONB (orthonormal basis) of $\mathbb{C}^n$ consisting of eigenvectors. You can replace this complex-entried ONB by a real-entried ONB, which turns each pair of conjugate eigenvalues $\pm i\lambda$ on the diagonal into a $2\times 2$ block of the kind you want, by using the following observation: if $u\in\mathbb{C}^n$ is a unit eigenvector corresponding to the eigenvalue $i\lambda$, write $u=x+iy$ where $x, y\in \mathbb{R}^n$. Since the matrix is real, it is then almost immediate that $x$ and $y$ are perpendicular and of length $1/\sqrt{2}$ (see the computation below). Using this, you can replace the original complex-entried ONB by a real-entried ONB.
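
Spelling out that "almost immediate" step (assuming $\|u\| = 1$ and $\lambda \neq 0$): because the matrix is real, the conjugate vector $\bar u = x - iy$ is an eigenvector for the eigenvalue $-i\lambda$, so by normality $u \perp \bar u$, and

$$0 = \langle \bar u, u\rangle = \|x\|^2 - \|y\|^2 + 2i\,x^{T}y, \qquad 1 = \|u\|^2 = \|x\|^2 + \|y\|^2,$$

which forces $x^{T}y = 0$ and $\|x\| = \|y\| = 1/\sqrt{2}$. The pair $\sqrt{2}\,x,\ \sqrt{2}\,y$ is then orthonormal, and the matrix sends $\sqrt{2}\,x \mapsto -\lambda\,\sqrt{2}\,y$ and $\sqrt{2}\,y \mapsto \lambda\,\sqrt{2}\,x$, which is exactly a $2\times 2$ block of the required form.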

Mitja