
I am trying to block diagonalize a real orthogonal matrix $A$. The condition is that the blocks should also be orthogonal. I found a fairly old yet abstract paper that says: "By block diagonalization methods one can obtain eigenvalues and eigenvectors while simultaneously “reducing” the size of the matrix, i.e., $\sigma(A) = \sigma(\Lambda_1) \cup \sigma(\Lambda_2)$."

However, I am not a math major and cannot understand much of it. So I am asking here whether it is possible, or implemented somewhere, to block diagonalize a matrix. The matrices I want to block diagonalize are always real, square, and orthogonal (satisfying $A^T(AA^T)^{-1}A = 1$).

As a sample matrix, $A = \begin{bmatrix} -0.874030050 & 0.634268244 & 0.001600016 & 0.000706214 \\ 0.917305750 & 0.569888842 & 0.000391318 & -0.001513837 \\ 0.001600016 & 0.000706214 & -0.874030050 & 0.634268244 \\ 0.000391318 & -0.001513837 & 0.917305750 & 0.569888842 \end{bmatrix}$

Say I want two $2\times 2$ blocks along the main diagonal. How can I obtain this?
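For concreteness, here is the sample matrix in Python, together with a crude measure of how far a $4\times 4$ matrix is from the desired two-block form (a minimal sketch assuming numpy; `off_block_norm` is just an illustrative helper, not a library function):

```python
import numpy as np

A = np.array([
    [-0.874030050,  0.634268244,  0.001600016,  0.000706214],
    [ 0.917305750,  0.569888842,  0.000391318, -0.001513837],
    [ 0.001600016,  0.000706214, -0.874030050,  0.634268244],
    [ 0.000391318, -0.001513837,  0.917305750,  0.569888842],
])

def off_block_norm(B):
    """Frobenius norm of the two off-diagonal 2x2 blocks of a 4x4 matrix."""
    return np.hypot(np.linalg.norm(B[:2, 2:]), np.linalg.norm(B[2:, :2]))

# Goal: an orthogonal T such that off_block_norm(T.T @ A @ T) is ~0
# and the two diagonal 2x2 blocks are themselves orthogonal.
print(off_block_norm(A))   # already small (~3e-3) for this sample
```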

EDIT 1: Is it possible to obtain a matrix like this?

$A = \begin{bmatrix} -0.874030050\pm c_1 & 0.634268244\pm c_2 & 0.0000001 & 0.00000004 \\ 0.917305750\pm c_3 & 0.569888842\pm c_4 & 0.00000008 & -0.00000037 \\ 0.000000016 & 0.000000014 & -0.874030050\pm c_5 & 0.634268244\pm c_6 \\ 0.000000018 & -0.00000037 & 0.917305750\pm c_7 & 0.569888842\pm c_8 \end{bmatrix}$ Here the $c_i$ are small corrections to the desired matrix blocks, and the remaining entries are zero or negligible. The Jordan normal form is an upper triangular matrix, and that is not what I want.

I think it can be done with the paper I am referring to, but it is quite old and contains many terms that I do not know, so I would also need an interpretation. Probably there are more recent methods as well.

EDIT 2: I have read here that Givens rotations can be applied to obtain orthonormal submatrices. I applied multiple Givens rotations to the matrix provided above; the result is below. However, the desired form has not been obtained. $A = \begin{bmatrix} -0.909289121 & 0.598620192 & -0.001459999 & -0.0000000 \\ 0.909289121 & 0.598620192 & 0.001459999 & -0.00000000 \\ -0.00000000 & -0.000959999 & 0.909290293 & 0.598619423 \\ -0.00000000 & -0.000959999 & -0.909290293 & 0.598619423 \end{bmatrix}$

I used only two rotations to get the above result; performing more rotations makes things worse. Is there a modified algorithm for this?
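For reference, here is a minimal sketch of a single two-sided Givens step (assuming numpy; `givens` is my own helper, not a library call). It illustrates why ad-hoc rotation sequences need not converge: the similarity transform preserves orthogonality and the spectrum, but the right-multiplication can re-fill entries that an earlier rotation zeroed.

```python
import numpy as np

def givens(n, i, j, theta):
    """n x n rotation by theta in the (i, j) coordinate plane."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = G[j, j] = c
    G[i, j], G[j, i] = -s, s
    return G

rng = np.random.default_rng(0)
A, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # a random orthogonal matrix

G = givens(4, 0, 2, 0.3)   # illustrative plane and angle
B = G.T @ A @ G            # two-sided (similarity) transform

print(np.allclose(B.T @ B, np.eye(4)))   # True: B is still orthogonal
print(np.allclose(np.sort_complex(np.linalg.eigvals(B)),
                  np.sort_complex(np.linalg.eigvals(A))))  # True: same spectrum
```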

Pro
    For general matrices, the block diagonalisation of a matrix can be described by the Jordan normal form. See also the section "Invariant subspace decompositions" therein. – DominikS Jan 24 '24 at 12:18
    For the numerics, maybe this post can help. – DominikS Jan 24 '24 at 12:27
  • I was looking at the Jordan normal form, which is not exactly what I want. I tried to implement it anyway, and there may be a problem with the sympy example, as it runs for hours and does not output anything. I didn't know much about the Jordan normal form, so thank you for your help. Could you please have a look at the edit and let me know what you think? – Pro Jan 26 '24 at 10:47
    I see, yes, it makes sense! – DominikS Jan 26 '24 at 12:08
    Generally, there always are subspaces that decompose the matrix if it has more than one eigenvalue: Consider a disjoint decomposition of the spectrum $\sigma(A) = \sigma_1 \dot\cup \sigma_2$. Then there are subspaces $V_1\oplus V_2 = V$ such that $A$ maps $V_j$ into $V_j$, and the restriction of $A$ on $V_j$ satisfies $\sigma(A|_{V_j}) = \sigma_j$. The $V_j$ would be the (generalized) eigenspaces, and it should be possible to extract this from the Jordan normal form (the columns of the transformation matrix). But I have to admit I'm out of my depth here, maybe someone else knows more. – DominikS Jan 26 '24 at 12:17
  • You wrote "I am trying to block diagonalize a real orthogonal matrix, A. The condition is that the blocks should also be orthogonal". What does the second sentence mean? They should be orthogonal matrices, or orthogonal to each other? In general Jordan forms are not going to be helpful. I suspect the Real Schur Decomposition will be, though -- ref e.g. here https://math.stackexchange.com/questions/4320723/real-schur-decomposition-of-orthogonal-matrix (a sketch of this appears after the comments). – user8675309 Jan 26 '24 at 19:14
    The equality $A^T(AA^T)^{-1}A = 1$ holds for any non-singular matrix. A matrix $A$ is orthogonal, if $A^TA=1$. – Alex Ravsky Jan 27 '24 at 09:38
  • @AlexRavsky Yes, you are right. That was my mistake. – Pro Jan 27 '24 at 10:13
  • @user8675309 I meant that each block will be an orthogonal matrix. "They should be orthogonal matrices": yes. "Orthogonal to each other": no. – Pro Jan 27 '24 at 10:15
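Following the real Schur suggestion in the comments: since an orthogonal matrix is normal, its real Schur form is not merely quasi-triangular but (up to rounding) block diagonal, with $2\times 2$ rotation blocks for complex conjugate eigenvalue pairs and $\pm 1$ entries for real eigenvalues. A minimal sketch, assuming scipy is available:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # a random orthogonal matrix

# Real Schur decomposition: A = Q @ S @ Q.T with Q orthogonal.  For a
# normal matrix the quasi-triangular factor S is block diagonal, with
# 2x2 rotation blocks and +-1 entries on the diagonal.
S, Q = schur(A, output='real')

print(np.round(S, 6))                    # the block-diagonal form
print(np.allclose(Q @ S @ Q.T, A))       # True: A is recovered
print(np.allclose(Q.T @ Q, np.eye(4)))   # True: Q is orthogonal
```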

1 Answer


As I understood the exercises to Chapter 4 of [Bel], we can proceed as follows. Let $N\ge 2$ be a natural number, let $A$ be an $N\times N$ orthogonal matrix, and let $\lambda\ne\pm 1$ be an eigenvalue of $A$ (necessarily $|\lambda|=1$). Let the corresponding eigenvector be $x+iy$, where $x$ and $y$ are real. Then the vectors $x$ and $y$ are orthogonal and have equal norms, so after normalizing them we can take an orthogonal matrix $T_1=(x,y,x_3,x_4,\dots,x_N)$ whose first two columns are $x$ and $y$. Then
$$T_1^TAT_1=\begin{pmatrix} \begin{pmatrix} \cos\varphi_1 & - \sin\varphi_1 \\ \sin\varphi_1 & \cos\varphi_1 \end{pmatrix} & 0\\ 0 & A_{N-2}\end{pmatrix}, $$
where the matrix $A_{N-2}$ is orthogonal too. Proceeding by induction, we can build an orthogonal matrix $T$ such that $T^TAT$ is a block-diagonal matrix whose blocks are $\begin{pmatrix} \cos\varphi_k & - \sin\varphi_k \\ \sin\varphi_k & \cos\varphi_k \end{pmatrix}$ or $(\pm 1)$.
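A minimal sketch of this construction (assuming numpy; the function name is mine, and it assumes the eigenvalues of $A$ are simple, as in the argument above):

```python
import numpy as np

def block_diagonalize_orthogonal(A, tol=1e-10):
    """Orthogonal T such that T.T @ A @ T is block diagonal, with a
    2x2 rotation block per conjugate eigenvalue pair and a +-1 entry
    per real eigenvalue.  Assumes A is orthogonal with simple spectrum."""
    vals, vecs = np.linalg.eig(A)
    cols = []
    for lam, v in zip(vals, vecs.T):
        if lam.imag > tol:             # one member of each conjugate pair
            x, y = v.real, v.imag      # orthogonal, with equal norms
            cols += [x / np.linalg.norm(x), y / np.linalg.norm(y)]
        elif abs(lam.imag) <= tol:     # real eigenvalue, i.e. +1 or -1
            x = v.real
            cols.append(x / np.linalg.norm(x))
    return np.column_stack(cols)

rng = np.random.default_rng(1)
A, _ = np.linalg.qr(rng.standard_normal((5, 5)))  # a random orthogonal matrix
T = block_diagonalize_orthogonal(A)
print(np.round(T.T @ A @ T, 8))   # rotation blocks and +-1 on the diagonal
```

For repeated eigenvalues the real and imaginary parts coming from different eigenvectors need not be orthogonal, so one would additionally orthonormalize within each eigenspace (e.g. with a QR step).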

References

[Bel] Richard Bellman, Introduction to Matrix Analysis, McGraw-Hill Book Company, Inc., New York–Toronto–London, 1960; Russian translation: Nauka, Moscow, 1969.

Alex Ravsky