I am trying to block diagonalize a real orthogonal matrix $A$, with the condition that the diagonal blocks should themselves be orthogonal. I found a fairly old and rather abstract paper that says, "By block diagonalization methods one can obtain eigenvalues and eigenvectors while simultaneously “reducing” the size of the matrix, i.e., $\sigma(A) = \sigma(\Lambda_1) \cup \sigma(\Lambda_2)$."
However, I am not a math major and cannot understand much of it, so I am asking here whether this kind of block diagonalization is possible and whether it is implemented somewhere. The matrices I want to block diagonalize are always real, square, and orthogonal (satisfying $A^T(AA^T)^{-1}A = 1$).
As a sample matrix, $ A = \begin{bmatrix} -0.874030050 & 0.634268244 & 0.001600016 & 0.000706214 \\ 0.917305750 & 0.569888842 & 0.000391318 & -0.001513837 \\ 0.001600016 & 0.000706214 & -0.874030050 & 0.634268244 \\ 0.000391318 & -0.001513837 & 0.917305750 & 0.569888842 \end{bmatrix}$
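For concreteness, this is how I enter the sample matrix in NumPy and numerically check the condition I stated above (together with the usual $A^TA = I$ residual, for comparison); this is just a sketch of my setup:

```python
import numpy as np

# Sample matrix from the question.
A = np.array([
    [-0.874030050,  0.634268244,  0.001600016,  0.000706214],
    [ 0.917305750,  0.569888842,  0.000391318, -0.001513837],
    [ 0.001600016,  0.000706214, -0.874030050,  0.634268244],
    [ 0.000391318, -0.001513837,  0.917305750,  0.569888842],
])

I = np.eye(4)

# Residual of the condition as I stated it: A^T (A A^T)^{-1} A = 1
r1 = np.linalg.norm(A.T @ np.linalg.inv(A @ A.T) @ A - I)

# Residual of the usual orthogonality condition A^T A = I, for comparison
r2 = np.linalg.norm(A.T @ A - I)

print(r1, r2)
```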
Say I want two 2x2 blocks along the main diagonal. How can I obtain this?
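I am not sure whether this is what the paper means, but the closest built-in routine I have found is SciPy's real Schur decomposition, `scipy.linalg.schur(..., output='real')`. As far as I understand, for an exactly orthogonal (hence normal) matrix the factor $T$ is block diagonal with 1x1 and 2x2 blocks, and $A = Z T Z^T$ with $Z$ orthogonal. A minimal sketch, using a randomly generated orthogonal matrix rather than my sample matrix:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)

# Build an exactly orthogonal 4x4 matrix via the QR factorization of a random matrix.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# Real Schur form: Q = Z @ T @ Z.T with Z orthogonal. For an orthogonal
# (hence normal) matrix, T should be block diagonal with 1x1 blocks (+/-1)
# and 2x2 rotation blocks.
T, Z = schur(Q, output='real')

print(np.round(T, 6))
print(np.linalg.norm(Q - Z @ T @ Z.T))  # reconstruction error
```

If this is on the right track, the 2x2 diagonal blocks of $T$ would play the role of $\Lambda_1$ and $\Lambda_2$ in the quoted paper, but I would appreciate confirmation.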
EDIT 1: Is it possible to obtain a matrix like this?
$ A = \begin{bmatrix} -0.874030050\pm c_1 & 0.634268244\pm c_2 & 0.0000001 & 0.00000004 \\ 0.917305750\pm c_3 & 0.569888842\pm c_4 & 0.00000008 & -0.00000037 \\ 0.000000016 & 0.000000014 & -0.874030050\pm c_5 & 0.634268244\pm c_6 \\ 0.000000018 & -0.00000037 & 0.917305750\pm c_7 & 0.569888842\pm c_8 \end{bmatrix}$ Here the $c_i$ are small corrections to the desired diagonal blocks, and the remaining entries are zero or negligible. The Jordan normal form is upper triangular, which is not what I want.
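To make "negligible" precise, here is a small helper (the name `off_block_norm` is just mine, for this post) that measures how much of a matrix lies outside its 2x2 diagonal blocks:

```python
import numpy as np

def off_block_norm(M, k):
    """Frobenius norm of M with its k x k diagonal blocks zeroed out."""
    R = M.astype(float).copy()
    n = M.shape[0]
    for start in range(0, n, k):
        R[start:start + k, start:start + k] = 0.0
    return np.linalg.norm(R)

# Tiny usage example: a matrix that is exactly 2x2 block diagonal.
B = np.array([
    [1.0, 2.0, 0.0, 0.0],
    [3.0, 4.0, 0.0, 0.0],
    [0.0, 0.0, 5.0, 6.0],
    [0.0, 0.0, 7.0, 8.0],
])
print(off_block_norm(B, 2))  # prints 0.0: nothing outside the diagonal blocks
```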
I think this should be possible with the paper I am referring to, but it is quite old and contains many terms I do not know, so I would also need help interpreting it. There are probably more up-to-date methods as well.
EDIT 2: I have read from here that Givens rotations can be applied to obtain orthonormal submatrices. I applied multiple Givens rotations to the matrix given above; the result is below, but it is not yet the desired form. $ A = \begin{bmatrix} -0.909289121 & 0.598620192 & -0.001459999 & -0.0000000 \\ 0.909289121 & 0.598620192 & 0.001459999 & -0.00000000 \\ -0.00000000 & -0.000959999 & 0.909290293 & 0.598619423 \\ -0.00000000 & -0.000959999 & -0.909290293 & 0.598619423 \end{bmatrix}$
I used only two rotations to get the above result; performing more rotations only makes things worse. Is there a modified algorithm for this?
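For concreteness, the kind of Givens update I mean is a single plane rotation $G$ applied as a similarity $GAG^T$; the sketch below shows the general form. The angle here is a placeholder, and choosing the angles (as well as whether the rotations should be applied one-sided or two-sided) is exactly where I suspect I am going wrong:

```python
import numpy as np

def givens(n, i, j, theta):
    """n x n Givens rotation acting in the (i, j) coordinate plane."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = c
    G[j, j] = c
    G[i, j] = s
    G[j, i] = -s
    return G

# Sample matrix from the top of the question.
A = np.array([
    [-0.874030050,  0.634268244,  0.001600016,  0.000706214],
    [ 0.917305750,  0.569888842,  0.000391318, -0.001513837],
    [ 0.001600016,  0.000706214, -0.874030050,  0.634268244],
    [ 0.000391318, -0.001513837,  0.917305750,  0.569888842],
])

# One similarity step with a Givens rotation: since G is orthogonal,
# G @ A @ G.T has the same eigenvalues as A.
theta = 0.5  # placeholder angle; picking theta is the part I am unsure about
G = givens(4, 0, 2, theta)
A1 = G @ A @ G.T
print(np.round(A1, 6))
```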