For every real orthogonal matrix $Q$ there is an orthogonal matrix $P$ and a block diagonal matrix $D$ such that $D=PQP^{t}$. Each block in $D$ is either $(1)$, $(-1)$, or a two-dimensional block of the form $\left( \begin{array}{cc} \cos(\alpha) & -\sin(\alpha) \\ \sin(\alpha) & \cos(\alpha) \\ \end{array} \right)$. Is there a constructive proof of this fact? Perhaps code in MATLAB or Sage?
I think (a suitable version of) Gaussian elimination might work, and it can diagonalize $Q$ over $\Bbb C$. – Berci Jun 30 '13 at 10:27
1 Answer
You could go with a real Schur decomposition, which gives a constructive proof. There is one here: it constructs a single $1 \times 1$ or $2 \times 2$ diagonal block and then proceeds inductively.
After you do that, use the orthogonality of $Q$ to show that $T$ (the quasitriangular factor from the Schur decomposition) is orthogonal and quasidiagonal (i.e., block diagonal with blocks of order at most $2$).
Next, it is easy to show that the blocks themselves must be orthogonal. The blocks of order $1$ are trivial. As for the blocks of order $2$, start from the most general form: $$U = \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix}.$$ Now use $U^TU = {\rm I}_2$ and $UU^T = {\rm I}_2$, i.e., \begin{align*} \begin{bmatrix} u_{11}^2 + u_{21}^2 & u_{11} u_{12} + u_{21} u_{22} \\ u_{11} u_{12} + u_{21} u_{22} & u_{12}^2 + u_{22}^2 \end{bmatrix} &= \begin{bmatrix} 1 \\ & 1 \end{bmatrix}, \\ \begin{bmatrix} u_{11}^2 + u_{12}^2 & u_{11} u_{21} + u_{12} u_{22} \\ u_{11} u_{21} + u_{12} u_{22} & u_{21}^2 + u_{22}^2 \end{bmatrix} &= \begin{bmatrix} 1 \\ & 1 \end{bmatrix}. \end{align*} Comparing the diagonal elements of the two products, you see right away that $u_{12}^2 = u_{21}^2$. Since everything is real, $u_{12} = \pm u_{21}$. Then the off-diagonal elements give $u_{11} = \mp u_{22}$. For now, assume that $u_{11} = u_{22} =: c$, so $u_{21} = -u_{12} =: s$; since $c^2 + s^2 = 1$, $c$ and $s$ are the cosine and sine of some angle $\varphi$.
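As a quick numeric sanity check of this rotation case (my addition, in Python/NumPy rather than the MATLAB or Sage the question asks for):

```python
import numpy as np

# A 2x2 block [[c, -s], [s, c]] with c^2 + s^2 = 1 is orthogonal;
# here c and s come from an arbitrary angle phi.
phi = 0.7
c, s = np.cos(phi), np.sin(phi)
U = np.array([[c, -s],
              [s,  c]])

# Both U^T U and U U^T equal the 2x2 identity, matching the equations above.
print(np.allclose(U.T @ U, np.eye(2)))  # True
print(np.allclose(U @ U.T, np.eye(2)))  # True
```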
What is left is the case $u_{11} = -u_{22}$ (so $u_{12} = u_{21}$). In this case, just compute the eigenvalues (you have the formulas here) and you'll see that they are $1$ and $-1$, which contradicts the fact (from the construction of the real Schur decomposition) that the blocks of order $2$ have complex conjugate pairs of eigenvalues.
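To address the question's request for code: a minimal sketch of the whole construction in Python with SciPy (my substitution for MATLAB/Sage), assuming `scipy.linalg.schur` with `output='real'` is available:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(42)

# Build a random real orthogonal matrix from the QR factorization
# of a random Gaussian matrix.
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))

# Real Schur decomposition: Q = P T P^T with P orthogonal and
# T real upper quasitriangular (1x1 and 2x2 diagonal blocks).
T, P = schur(Q, output='real')

# Since Q is orthogonal (hence normal), T is in fact quasiDIAGONAL:
# every entry at distance >= 2 above the diagonal vanishes up to
# rounding error, and T itself is orthogonal.
print(np.abs(np.triu(T, k=2)).max())    # ~ machine precision
print(np.allclose(T @ T.T, np.eye(5)))  # True
print(np.allclose(P @ T @ P.T, Q))      # True
```

The $1 \times 1$ blocks of `T` are then $\pm 1$ and the $2 \times 2$ blocks are rotations, by the argument in the answer.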
It is a good idea, but above the block diagonal you don't have zeros. Is it always possible to obtain the "canonical form" with the three types of blocks and zeros elsewhere? Is it possible to write an algorithm for that? – Oliver Jul 01 '13 at 19:28
You do have zeros. Whenever you apply the Schur decomposition to a normal matrix (orthogonal matrices being a very special case of normal matrices), you get a diagonal form (quasidiagonal here, since you want everything to remain real). Read the second paragraph of my reply: for a quasitriangular $T$, $TT^T = T^TT = {\rm I}$ implies that $T$ is quasidiagonal. This is easy to show for an orthogonal $T$ (which it is, since the Schur decomposition preserves the matrix structure). – Vedran Šego Jul 01 '13 at 19:42
This is, of course, the theory. In practice, some rounding errors may occur and you may get very small (around machine precision) off-diagonal elements, but these are easily sorted out. I'm not sure whether you just set them to zero or apply some further minor correction to the diagonal blocks. Search for the actual real Schur algorithm yourself; I don't have my books nearby at the moment, so I can't look it up. – Vedran Šego Jul 01 '13 at 19:43
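A hypothetical cleanup step of the crude kind mentioned in the comment above (the name `clean_blocks` and the tolerance are my own choices, not taken from any reference):

```python
import numpy as np

def clean_blocks(T, tol=1e-12):
    """Zero out entries of T whose magnitude is below tol.

    This is the crude variant: tiny off-diagonal round-off is simply
    set to zero, with no further correction of the diagonal blocks.
    """
    T = T.copy()
    T[np.abs(T) < tol] = 0.0
    return T

# A quasidiagonal matrix polluted with ~machine-precision noise:
T = np.array([[0.6, -0.8, 1e-16],
              [0.8,  0.6, 2e-16],
              [0.0,  0.0, 1.0]])
print(clean_blocks(T))
```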
You are right: for normal matrices (in theory) you get zeros, and this would solve the problem. However, I was not able to find a constructive proof of the (real) Schur decomposition. Here the important word is "constructive": the proofs (that I was able to find) assume knowledge of an eigenpair. – Oliver Jul 03 '13 at 09:26
The Schur decomposition, which is what you're after, reveals the eigenvalues (and, in the case of normal matrices, an orthogonal set of eigenvectors), but it has no non-iterative algorithm. So, you either have to assume that you know an eigenpair and construct from there (I consider this a constructive proof, although it cannot be translated into an algorithm), or you have to iterate, with nothing constructive in the iterations. – Vedran Šego Jul 03 '13 at 11:12
This is just the nature of the eigenvalue problem. If I remember right, it was proven that the SVD cannot be computed in a non-iterative way. This automatically translates to the eigenvalue problem for positive definite matrices, so the general eigenvalue problem also cannot be solved non-iteratively, and hence neither can the Schur decomposition. – Vedran Šego Jul 03 '13 at 11:12