
Suppose $A \in \mathbb{Q}^{n \times n}$, with $n$ odd. Moreover, suppose $\lambda_1 = \bar{\lambda_2}$ are a complex conjugate pair of eigenvalues of $A$ (i.e. eigenvalues with nonzero imaginary part), both with modulus greater than $1$ and greater than or equal to the modulus of every other eigenvalue of $A$. Because $n$ is odd, $A$ has at least one real eigenvalue, and thus at least one eigenvector in $\mathbb{R}^n$.

Geometrically, we can view $A$ as transforming $n$-dimensional Euclidean space by a rotation plus an expansion around an "axis", namely the lines spanned by the eigenvectors corresponding to real eigenvalues, where the rotation comes from the complex eigenvalues.

Take a nonzero $v \in \mathbb{Q}^n$, and let $\mathcal{L} = \operatorname{span}(v)\subset \mathbb{R}^n$ denote the line spanned by $v$. Does there exist a $k \in \mathbb{N}^+$ such that $A^k \mathcal{L} = \mathcal{L}$, where $A\mathcal{L}$ denotes the image of the line $\mathcal{L}$ under $A$?

I would expect the answer to be "yes", because at least in the three-by-three case we could view such a matrix as rotating about a single axis (given by the real eigenvector), and if the complex eigenvalues have rational coefficients, I would expect some form of periodicity, as in the case of a 2-by-2 rotation matrix through a rational angle.
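The question can be probed experimentally with exact rational arithmetic. Below is a small Python sketch (the matrix and vector are hypothetical examples I chose, not from the question): it searches for the smallest $k$ with $A^k\mathcal{L}=\mathcal{L}$ by testing whether $A^k v$ is parallel to $v$.

```python
from fractions import Fraction as F

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def parallel(u, v):
    # u and v are parallel iff every 2x2 minor of the pair vanishes
    return all(u[i] * v[j] == u[j] * v[i]
               for i in range(len(u)) for j in range(i + 1, len(u)))

def line_period(A, v, kmax=200):
    """Smallest k <= kmax with A^k v parallel to v, else None."""
    w = v
    for k in range(1, kmax + 1):
        w = matvec(A, w)
        if parallel(w, v):
            return k
    return None

# Hypothetical example: rotation by 90 degrees scaled by 2 in the
# first two coordinates, plus a real eigenvalue 3 on the third axis.
A = [[F(0), F(-2), F(0)],
     [F(2), F(0),  F(0)],
     [F(0), F(0),  F(3)]]
v = [F(1), F(1), F(0)]     # lies in the rotation plane
print(line_period(A, v))   # → 2: the line returns after two 90° steps
```

A vector with components both in the rotation plane (modulus $2$) and along the third axis (modulus $3$) would drift instead, and the search would report None, which already hints at the role of the relative moduli.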

  • what does the statement "Moreover, suppose $A$ has eigenvalues $\lambda_1\geq\lambda_2\geq...\geq\lambda_n$, with $\lambda_1 = \bar{\lambda_2}$ both complex" mean? Also what is "$|\lambda_1| =|\lambda_1|>1$" supposed to mean? Also the statement "because has rational entries so must its eigenvalues" – user8675309 Oct 24 '23 at 01:41
  • @user8675309 I edited the question. – user918212 Oct 24 '23 at 02:22
  • Why don't you first try the case $n=3$? – Jean Marie Oct 24 '23 at 08:05
  • @JeanMarie Frankly I am not sure how to prove this, even for the three-by-three case. If you were able to provide a complete solution for even just that case I would be grateful :) – user918212 Oct 24 '23 at 13:31
  • @JeanMarie my main source of confusion (in the $n=3$ case), is that if the real eigenvalue has modulus other than 1, I would expect to see vectors scaled along the direction of the associated real eigenvector, not just rotated and scaled. I.e., I would not expect there to be "invariant cones", but this is what I see experimentally. – user918212 Oct 24 '23 at 13:50
  • In the case $n=3$, your matrix with two complex conjugate eigenvalues $\rho e^{\pm i\theta}$ and a real one associated with eigenvector $U$ is of the form $A=\rho R$, where $R$ is the matrix of the rotation with axis directed by $U$ and rotation angle $\theta$. Do you agree? Therefore you will have $A^kV=V$ iff $\theta=2 \pi m /k$ for a certain integer $m$. – Jean Marie Oct 24 '23 at 17:18
  • @JeanMarie Your first step $A = \rho R$ with $R$ a rotation matrix about axis $U$ is not clear to me, and I think points to the gap in my understanding, because I am unclear why the real eigenvalue does not change this away from being a rotation about an axis. And what if $n$ is a larger odd number? I will add this is something I have tried to find in textbooks, but unsuccessfully, so I would appreciate recommendations there as well. – user918212 Oct 25 '23 at 02:32

1 Answer


I realize now that in my last comment I had implicitly assumed that your matrix is a scalar multiple of a rotation matrix, which is not the case.

Having addressed my own misconception, I realize that yours comes mainly from the fact that (taking again the case $n=3$) a matrix with two complex conjugate eigenvalues and (therefore) one real eigenvalue can be rather "distant" from a rotation matrix, as I will attempt to show.

In fact, any matrix admits a classical decomposition of the form

$$A=QR$$

classically called the "$QR$ decomposition", where $Q$ is an orthogonal matrix ($Q^T=Q^{-1}$; a rotation matrix when $\det Q = 1$, a reflection otherwise) and $R$ is upper triangular.

Please note the order : $R$ is applied first, then rotation $Q$.

In very special cases, $R$ is diagonal: $R=\operatorname{diag}(r_1,r_2,r_3)$, in which case applying $A$ amounts to first scaling the three axes by $r_1,r_2,r_3$ and then applying the rotation. In even more special cases $r_1=r_2=r_3=\rho$, which is the case I had too hastily privileged.

But in general, the upper triangular matrix $R$ induces an initial "scrambling" of the axes which makes things rather hard to trace back, in particular regarding rotations.

Remark: in a different direction, the "closest orthogonal matrix" to a given matrix is obtained from the SVD; see for example this question and its answer for the distance associated with the so-called Frobenius norm (see the example below).

Let us take the example of

$$A = \pmatrix{ 0& 0&1\\ 1&0&1\\ 0&1& 0}$$

with real eigenvalue $\rho \approx 1.3247$, the real root of $x^3-x-1$ (the plastic number), associated with the eigenvector $(1,\rho^2,\rho)^T$.

Its QR factorization is

$$A=\underbrace{\pmatrix{ 0& 0&1\\ -1&0&0\\ 0&-1& 0}}_{\text{rotation matrix} \ Q} \underbrace{\pmatrix{ -1& 0&-1\\ 0&-1&0\\ 0&0& 1}}_{\text{triangular matrix} \ R}\tag{1}$$
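Factorization (1) is easy to verify numerically; a minimal numpy sketch (note that numpy's own `numpy.linalg.qr` may return the factors with a different sign convention, so here we simply check the matrices written above):

```python
import numpy as np

# The matrices of factorization (1)
A = np.array([[0., 0., 1.],
              [1., 0., 1.],
              [0., 1., 0.]])
Q = np.array([[ 0.,  0., 1.],
              [-1.,  0., 0.],
              [ 0., -1., 0.]])
R = np.array([[-1.,  0., -1.],
              [ 0., -1.,  0.],
              [ 0.,  0.,  1.]])

assert np.allclose(Q @ R, A)            # the product reproduces A
assert np.allclose(Q.T @ Q, np.eye(3))  # Q is orthogonal
print(np.linalg.det(Q))                 # det = +1: a genuine rotation
```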

The characteristic polynomials of $A$ and $Q$,

$$\chi_A(x)=x^3-x-1, \ \ \chi_Q(x)=x^3-1$$

are different: their complex eigenvalues aren't the same, so the rotation angles will not be the same. Even the real eigenvalues differ: $A$ has the plastic number $\rho\approx 1.3247$, whereas $Q$ has $1$, with associated eigenspace $\mathbb{R}V$ for $V=(1,-1,1)^T$. So in general neither the angle nor the axis of rotation of $Q$ needs to match the data of $A$.
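Both characteristic polynomials can be confirmed numerically (a small numpy sketch; `np.poly` applied to a matrix returns the coefficients of its characteristic polynomial, highest degree first):

```python
import numpy as np

A = np.array([[0., 0., 1.],
              [1., 0., 1.],
              [0., 1., 0.]])
Q = np.array([[ 0.,  0., 1.],
              [-1.,  0., 0.],
              [ 0., -1., 0.]])

print(np.round(np.poly(A), 6))  # coefficients of chi_A: x^3 - x - 1
print(np.round(np.poly(Q), 6))  # coefficients of chi_Q: x^3 - 1
# The unique real eigenvalue of A (the plastic number, about 1.3247):
print(max(np.linalg.eigvals(A).real))
```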

Besides, the closest orthogonal matrix to $A$ in the sense of the Frobenius norm (as said above) isn't $Q$ as defined in (1) but

$$Q'=\pmatrix{-a&0&2a\\2a&0&a\\0&1&0}, \quad \text{where } a:=\frac{1}{\sqrt{5}},$$

with characteristic polynomial

$$\chi_{Q'}(x)=x^3+ax^2-ax-1=(x-1)(x^2+(1+a)x+1).$$

Like $Q$, this $Q'$ has the real eigenvalue $1$, but with a different associated eigenspace, directed by $(1,\Phi,\Phi)^T$ where $\Phi=\frac12(1+\sqrt{5})$ (the golden ratio).
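This $Q'$ can be recovered numerically: the Frobenius-closest orthogonal matrix to an invertible $A$ is the orthogonal polar factor $UV^T$ from the SVD $A=U\Sigma V^T$. A small numpy sketch:

```python
import numpy as np

A = np.array([[0., 0., 1.],
              [1., 0., 1.],
              [0., 1., 0.]])

# Orthogonal polar factor: closest orthogonal matrix in Frobenius norm
U, s, Vt = np.linalg.svd(A)
Qp = U @ Vt

a = 1 / np.sqrt(5)
Q_expected = np.array([[-a,  0., 2*a],
                       [2*a, 0., a  ],
                       [0.,  1., 0. ]])
assert np.allclose(Qp, Q_expected)   # matches the Q' given above
print(np.linalg.norm(A - Qp))        # Frobenius distance to that rotation
```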

Jean Marie