
Suppose that $A$ is a complex matrix satisfying $A^TA = I$ (where $A^T$ denotes the entrywise transpose, not the conjugate transpose). What can be said about the eigenvalues of $A$, if $A$ is "complex-orthogonal" in this sense?

Of course, for any eigenpair $(\lambda,x)$, we have $$ x^Tx = x^TA^TAx = (Ax)^TAx = \lambda^2 (x^Tx) $$ which allows us to conclude that $\lambda^2 = 1$... so long as $x^Tx \neq 0$. Can anything else be said? Does the case in which $A$ has real entries allow us to conclude that $|\lambda| = 1$?
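To see concretely why that caveat has teeth, note that a nonzero complex vector can be isotropic, i.e. satisfy $x^Tx = 0$. A throwaway numerical sketch (Python/numpy, purely illustrative):

```python
import numpy as np

# A nonzero complex vector can be "isotropic": x^T x = 0,
# so the computation above places no constraint on lambda for such x.
x = np.array([1j, 1])
print(x @ x)  # (1j)**2 + 1**2 = 0j
```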

Ben Grossmann
  • For the first question, what about $x^*x = x^*A^*Ax = (Ax)^*Ax = \overline{\lambda}\lambda\, x^*x = |\lambda|^2 x^*x$, so $|\lambda|^2=1$? – Gerry Myerson May 21 '17 at 12:25
  • @GerryMyerson $A^*$ is the adjoint of $A$ relative to the Hermitian inner product, i.e. $$ \langle x, y \rangle = y^*x = \sum_{j=1}^n x_j \overline{y_j};$$ the point is to avoid this. – Ben Grossmann May 21 '17 at 12:46
  • For any of you interested in the (now gone) first part of this question, I've asked it separately over here. – Ben Grossmann May 21 '17 at 13:38
  • $A^*$ is the matrix you get from $A$ by flipping it across the main diagonal and taking the complex conjugate. You can do both those actions without ever having heard of inner products. – Gerry Myerson May 21 '17 at 23:08
  • @GerryMyerson sure, I guess you could. It would be a pretty unintuitive thing to do, though. Why, for instance, should we expect that $(AB)^* = B^*A^*$? – Ben Grossmann May 22 '17 at 00:41
  • We know $(AB)^t=B^tA^t$, and we know $\overline{AB}=\overline{A}\,\overline{B}$, and it follows immediately from that. – Gerry Myerson May 22 '17 at 03:11
  • @GerryMyerson okay, and I suppose that $(AB)^T = B^TA^T$ is supposed to be derived without any mention of inner products or dual spaces, then, as a neat thing that happens when you reflect matrices. – Ben Grossmann May 22 '17 at 03:16
  • If $AB=C$, then $C_{ij}=\sum_kA_{ik}B_{kj}$; together with $(A^t)_{ij}=A_{ji}$, that is all you need to prove $(AB)^t=B^tA^t$. No inner products, no dual spaces, just the definition of matrix multiplication and the definition of transpose. – Gerry Myerson May 22 '17 at 03:24
  • @GerryMyerson I agree that it's possible, I just don't like it, since it makes the transpose into an unintuitive "trick". – Ben Grossmann May 22 '17 at 03:29

5 Answers


This has been thoroughly studied in the paper "The Jordan Canonical Forms of complex orthogonal and skew-symmetric matrices" by Horn and Merino (1999) and also in Olga Ruff's master's thesis "The Jordan Canonical Forms of complex orthogonal and skew-symmetric matrices: characterisation and examples" (2007). In particular, Theorem 1.2.3 (pp. 31-32) of Ruff's thesis states that

An $n\times n$ complex matrix is similar to a complex orthogonal matrix if and only if its Jordan Canonical Form can be expressed as a direct sum of matrices of only the following three types:

(a) $J_k(\lambda)\oplus J_k(\lambda^{-1})$ for $\lambda\in\mathbb C\setminus\{0\}$ and any $k$,

(b) $J_k(1)$ for any odd $k$ and

(c) $J_k(-1)$ for any odd $k$.

(Note that when $\lambda=\pm1$, the matrix in (a) is just two copies of $J_k(\lambda)$. Hence case (a) implies that an even-sized Jordan block for $\lambda=\pm1$ must appear an even number of times, while (a), (b) and (c) together allow an odd-sized Jordan block for $\lambda=\pm1$ to appear any number of times.)

In particular, every nonzero complex number is an eigenvalue of some complex orthogonal matrix, and for each complex orthogonal matrix, all eigenvalues $\ne\pm1$ must occur in reciprocal pairs.
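As a numerical sanity check of the reciprocal pairing (my own sketch, not taken from Horn and Merino or Ruff): if $S$ is complex skew-symmetric, then $A = e^S$ satisfies $A^TA = e^{S^T}e^S = e^{-S}e^S = I$, so it gives a ready supply of complex orthogonal matrices.

```python
import numpy as np
from scipy.linalg import expm

# If S^T = -S, then A = exp(S) satisfies A^T A = exp(-S) exp(S) = I.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
S = M - M.T                              # complex skew-symmetric
A = expm(S)

print(np.allclose(A.T @ A, np.eye(4)))   # True: A is complex orthogonal

# Eigenvalues != +-1 should occur in reciprocal pairs (lambda, 1/lambda):
eigs = np.linalg.eigvals(A)
print(np.sort_complex(eigs))             # same multiset (up to rounding) ...
print(np.sort_complex(1 / eigs))         # ... as the reciprocals
```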

user1551
  • Interestingly, Ruff's statement and Horn and Merino's seem inequivalent. Horn and Merino further explicitly exclude the possibility of $J_k(\pm1) \oplus J_k(\pm 1)$ for odd $k$ (Theorem 1), whereas Ruff (quoted here) does not. – ComptonScattering Mar 03 '21 at 15:30
  • @ComptonScattering $J_k(1)\oplus J_k(1)$ for odd $k$ is allowed in Horn and Merino. It is the direct sum of two matrices of type (d). The authors just state the same theorem differently. It might be clearer to say that even-sized Jordan blocks for $\lambda=\pm1$ must occur an even number of times but odd-sized Jordan blocks for $\lambda=\pm1$ can occur any number of times. – user1551 Mar 03 '21 at 15:45
  • Yes you are correct. – ComptonScattering Mar 03 '21 at 15:48
  • Ah, I see. – coiso Sep 25 '23 at 18:02

$$A=\pmatrix{\frac{a+a^{-1}}2&i\frac{a-a^{-1}}2\\ -i\frac{a-a^{-1}}2&\frac{a+a^{-1}}2}$$ has $A^tA=I$ and has $a$ and $a^{-1}$ as eigenvalues.

As an example $a=2$ gives $$A=\pmatrix{\frac54&\frac34i\\-\frac34i&\frac54}$$ and $$A^tA=\pmatrix{\frac54&-\frac34i\\\frac34i&\frac54} \pmatrix{\frac54&\frac34i\\-\frac34i&\frac54}=\pmatrix{1&0\\0&1}.$$
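For what it's worth, here is a quick numerical check of this family (a disposable Python/numpy snippet; $a=2$ as above, but any nonzero $a$ works):

```python
import numpy as np

# The 2x2 family above with a = 2: complex orthogonal, eigenvalues a and 1/a.
a = 2.0
A = 0.5 * np.array([[a + 1/a, 1j * (a - 1/a)],
                    [-1j * (a - 1/a), a + 1/a]])

print(np.allclose(A.T @ A, np.eye(2)))   # True: A^T A = I
print(np.linalg.eigvals(A))              # approximately [2, 0.5]
```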

Angina Seng
  • That answers part of the second half of my question, so thanks for that. Still, I'm looking for something more comprehensive. – Ben Grossmann May 21 '17 at 13:01
  • @Omnomnomnom For even $n$ the quadratic form $z_1^2+\cdots+z_n^2$ is equivalent to $w_1w_2+w_3w_4+\cdots$. Is this a useful step towards your "comprehensive" vision? – Angina Seng May 21 '17 at 13:07
  • About your other comment, I suppose that depends on what exactly an "equivalence" of quadratic forms refers to in this case – Ben Grossmann May 21 '17 at 13:15
  • @Omnomnomnom For equivalence of quadratic forms see Wikipedia https://en.wikipedia.org/wiki/Quadratic_form – Angina Seng May 21 '17 at 13:30
  • You're right about the matrix... it was quick to check by hand after all. I'm surprised that W|A made a mistake there; I don't think I have a typo. – Ben Grossmann May 21 '17 at 13:33
  • I've changed the question to suit your answer. Thanks for your time and effort. – Ben Grossmann May 21 '17 at 13:59

At the risk of stating the obvious, as it hasn't been explicitly stated here, I add that:

  1. Each eigenvalue is either (a) one of a pair of eigenvalues $(\lambda, \lambda^{-1})$, or (b) $\lambda = \pm 1$.
  2. The product of all the eigenvalues satisfies $\prod_i \lambda_i = \pm 1$.

Both of these claims are straightforward to show (a short numerical check follows the argument):

  1. Let $A x = \lambda x$, so that $x$ is a (right) eigenvector. Transposing gives $x^T A^T = \lambda x^T$, and since $A^T = A^{-1}$ this means $x^T A = \lambda^{-1} x^T$, i.e. $x$ is a left eigenvector with eigenvalue $\lambda^{-1}$; since the left and right spectra of a matrix coincide, $\lambda^{-1}$ is also an eigenvalue of $A$. Moreover, $x^T A x = \lambda\, x^T x = \lambda^{-1}\, x^T x$. If (a) $x^T x = 0$, then $\lambda$ may take any nonzero value and pairs with the eigenvalue $\lambda^{-1}$; if (b) $x^T x \neq 0$, then $\lambda = \lambda^{-1}$, i.e. $\lambda = \pm 1$.
  2. Note $1 = \det(I) = \det(A A^T) = \det(A)\det(A^T)= \det(A)^2$. This implies $\det(A) = \prod_i \lambda_i = \pm 1$.
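The promised numerical check of both claims, reusing the $2\times 2$ example from another answer (the value $a=2$ is arbitrary; this is an illustration, not a proof):

```python
import numpy as np

# A 2x2 complex orthogonal matrix with eigenvalues a and 1/a (here a = 2).
a = 2.0
A = 0.5 * np.array([[a + 1/a, 1j * (a - 1/a)],
                    [-1j * (a - 1/a), a + 1/a]])

lams, vecs = np.linalg.eig(A)
for lam, x in zip(lams, vecs.T):
    # Claim 1: for lambda != +-1 the eigenvector is isotropic, x^T x = 0.
    print(lam, x @ x)
# Claim 2: det(A) = product of the eigenvalues = +-1 (here 2 * 1/2 = +1).
print(np.linalg.det(A))
```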

I have some simple ideas:

Let $A$ be an orthogonal matrix, i.e. $A^TA=AA^T=I$. From the very definition of an orthogonal matrix, $A$ and $A^T$ are inverses of each other, so their eigenvalues are reciprocals: if $a \in \mathbb{C}$ is an eigenvalue of $A$, then $\frac{1}{a}$ is an eigenvalue of $A^T$. But we know that the eigenvalues of $A$ and $A^T$ are the same. So $a=\frac{1}{a}$, which implies $a^2=1$. If we take $a=a_1+ia_2$ then $a_1^2-a_2^2=1$ and $2a_1a_2=0$, which implies the modulus of $a$ is $1$.

  • Noting that $A$ has the same eigenvalues as $A^{-1}$ is not enough to deduce that every eigenvalue satisfies $a = 1/a$. For instance, note that $$ A = \pmatrix{2&0\\0&1/2} $$ has the same eigenvalues as its inverse. – Ben Grossmann May 21 '17 at 13:25
  • ohh... I see. I made a mistake doing $a=\frac{1}{a}$. thank you – Hirakjyoti Das May 21 '17 at 13:29

To clarify the result from the existing answer: we can get a complex-orthogonal matrix with eigenvalues $a,a^{-1}$ with $$ A = \frac 12 \pmatrix{a+a^{-1} & i(a - a^{-1})\\ -i(a - a^{-1}) & a + a^{-1}} = \pmatrix{i&-i\\1&1} \pmatrix{a\\&a^{-1}}\pmatrix{i&-i\\1&1}^{-1} $$
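A quick numerical confirmation of this factorization (a disposable check; the sample value $a = 3 - 4i$ is my arbitrary choice):

```python
import numpy as np

# Verify P D P^{-1} reproduces the closed form and is complex orthogonal.
a = 3 - 4j
P = np.array([[1j, -1j], [1, 1]])          # columns: eigenvectors for a, 1/a
D = np.diag([a, 1/a])
A = P @ D @ np.linalg.inv(P)

expected = 0.5 * np.array([[a + 1/a, 1j * (a - 1/a)],
                           [-1j * (a - 1/a), a + 1/a]])
print(np.allclose(A, expected))            # True
print(np.allclose(A.T @ A, np.eye(2)))     # True: A^T A = I
```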

Ben Grossmann