
Let $\{X_i\}$ be a set of $n\times n$ matrices with $[X_i, X_j] = X_i X_j - X_j X_i = 0$. A theorem by Schur shows that they can be simultaneously brought to the form:

$$ X_i = \alpha \left[ \begin{matrix} I_{n/2} & M_i \\ 0 & I_{n/2} \end{matrix} \right]$$

where $M_i$ is an $\frac{n}{2} \times \frac{n}{2}$ matrix. How can one find the diagonalizing basis, and how computationally hard is this calculation to perform?

  • https://math.stackexchange.com/questions/590772/commuting-matrices-are-simultaneously-triangularizable is a reference. I'm a bit uncertain, but isn't it enough to put one of these matrices in such a form, so that the same transform would do the same to the rest? – Stefan Jun 20 '18 at 12:20
  • @Stefan There is a counterexample to that: $\left( \begin{array}{cccc} 1 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{array} \right)$ and $\left( \begin{array}{cccc} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 1 \end{array} \right)$ – Yotam Vaknin Jun 20 '18 at 12:28
  • Some ideas: we can find $\alpha$ as $\operatorname{tr}[X_i]/n$. We can also find half of the basis by solving the equation $X_i v = \alpha v$ for any single $i$. – Yotam Vaknin Jun 20 '18 at 13:51
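
The counterexample in the comment above is easy to check numerically. A minimal sketch in NumPy (the names `A` and `B` are just labels for the two matrices quoted in the comment):

```python
import numpy as np

# The two commuting matrices from the comment: A = I + E_{13}, B = I + E_{42}.
A = np.eye(4)
A[0, 2] = 1.0
B = np.eye(4)
B[3, 1] = 1.0

# They commute:
assert np.allclose(A @ B, B @ A)

# A is already in the block form from the question (alpha = 1, upper-right
# block M = [[1, 0], [0, 0]]), but B is not: its off-diagonal entry sits in
# the lower-left block. So a basis that works for one matrix of the family
# need not work for the others.
print(A, B, sep="\n\n")
```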

1 Answer


After some work, I think I found something.

We can easily find $\alpha$: since each $X_i$ has trace $\alpha n$, we have $\alpha = \operatorname{tr}[X_i]/n$ for every $i$.

After finding $\alpha$, we can normalize: $X'_i = X_i/\alpha$. I will assume $\alpha = 1$ from now on.
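
As a small illustration of these two steps, here is a sketch only, assuming the family is given as a hypothetical Python list `Xs` of square NumPy arrays of the stated form:

```python
import numpy as np

def normalize(Xs):
    """Recover alpha and rescale the family so that alpha = 1.

    Assumes each X has the form alpha * [[I, M], [0, I]], so that
    tr(X) = alpha * n and hence alpha = tr(X) / n for every member.
    """
    n = Xs[0].shape[0]
    alpha = np.trace(Xs[0]) / n
    return alpha, [X / alpha for X in Xs]
```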

For each $i$ there are at least $n/2$ linearly independent vectors $v$ such that: $$ X_i v = v$$

We can find them by solving $(X_i - I)v = 0$ for any single $i$. This is an $O(n^3)$ calculation as far as I can see.
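
Concretely this is a null-space computation for $X_i - I$; a dense SVD-based routine does it in $O(n^3)$, matching the estimate above. A sketch with SciPy (the helper name `fixed_vectors` is mine):

```python
import numpy as np
from scipy.linalg import null_space

def fixed_vectors(X):
    """Orthonormal basis of {v : X v = v}, i.e. of ker(X - I).

    scipy.linalg.null_space works via an SVD, so the cost is O(n^3);
    for the matrices in the question the result has at least n/2 columns.
    """
    n = X.shape[0]
    return null_space(X - np.eye(n))  # columns are the fixed vectors
```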

Using the $n/2$ vectors we found as the first columns of $P$, together with any $n/2$ further vectors that complete them to a basis, every matrix in the family takes the form: $$ P^{-1}X_iP = \left(\begin{matrix} I & M_i\\ 0 & S_i \end{matrix} \right)$$

where $S_i$ is a diagonalisable matrix whose only eigenvalue is $1$. We get the final form of the $X_i$ by simultaneously diagonalising the $S_i$ (which is standard practice).
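
To make the last two steps concrete, here is a hedged sketch under the answer's assumptions. All function names are mine, and the simultaneous diagonalisation uses the standard random-linear-combination trick for a commuting family of diagonalisable matrices, not anything specific to this answer.

```python
import numpy as np

def block_form_basis(V):
    """Complete the columns of V (common fixed vectors) to an invertible P.

    A full QR factorization of V supplies an orthonormal complement, so
    P = [V | complement] is invertible; since X v = v on span(V), the first
    k columns of P^{-1} X P are [I_k; 0], giving the block form above.
    """
    n, k = V.shape
    Q, _ = np.linalg.qr(V, mode="complete")
    return np.hstack([V, Q[:, k:]])

def simultaneous_diagonalization(Ss, seed=0):
    """Eigenvector matrix that diagonalises every matrix in the commuting,
    diagonalisable family Ss.

    Trick: for generic random coefficients, the eigenspaces of a linear
    combination of the S_i coincide with their common eigenspaces, so a
    single eigendecomposition suffices.
    """
    rng = np.random.default_rng(seed)
    combo = sum(rng.standard_normal() * S for S in Ss)
    _, Q = np.linalg.eig(combo)
    return Q

# Hypothetical usage, with fixed_vectors and normalize from the sketches above:
# _, Xs = normalize(Xs)
# V = fixed_vectors(Xs[0])
# P = block_form_basis(V)
# k = V.shape[1]
# Ss = [np.linalg.solve(P, X @ P)[k:, k:] for X in Xs]
# Q = simultaneous_diagonalization(Ss)
```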