
Consider a matrix $a\in M_n(M_n(\mathbb{C}))$ such that each entry is an orthogonal projection, $a_{ij}=a_{ij}^2=a_{ij}^*$, and the sum along any row or column is the identity: $$\sum_{k=1}^n a_{ik}=I_n=\sum_{k=1}^na_{kj}.$$ Necessarily the entries along a row or column are mutually orthogonal (this is a general fact about partitions of unity in unital $\mathrm{C}^*$-algebras: if orthogonal projections satisfy $\sum_i P_i =I$, then $i\ne j \Rightarrow P_{j}P_{i}=0$). Suppose we want to do this such that each $a_{ij}$ is of rank one, say associated to a one-dimensional subspace $\mathbb{C}x_{ij}$, where $\widehat{x_{ij}}=x_{ij}/\|x_{ij}\|$: $$a_{ij}(y)=\langle \widehat{x_{ij}},y\rangle \widehat{x_{ij}}\qquad (x_{ij},y\in\mathbb{C}^n).$$
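As a numerical sanity check of this setup (a NumPy sketch, not part of the question itself; the particular rotation used to produce an orthonormal basis is just an example):

```python
import numpy as np

def proj(x):
    """Rank-one orthogonal projection onto the line C·x."""
    x = np.asarray(x, dtype=complex)
    x = x / np.linalg.norm(x)
    return np.outer(x, x.conj())

# An orthonormal basis of C^3 (the columns of any orthogonal matrix);
# here, a permutation matrix applied to the standard basis.
r = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
basis = [r @ e for e in np.eye(3)]

ps = [proj(v) for v in basis]

# Partition of unity: the rank-one projections sum to the identity ...
assert np.allclose(sum(ps), np.eye(3))
# ... and are consequently mutually orthogonal.
for i in range(3):
    for j in range(3):
        if i != j:
            assert np.allclose(ps[i] @ ps[j], 0)
```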

Note that such an $a$ also gives (non-injectively) a matrix of vectors $x^a\in M_n(\mathbb{C}^n)$: $x^a=[x_{ij}]_{i,j=1}^n$. On the other hand, if we start with a matrix $x\in M_n(\mathbb{C}^n)$ of vectors such that every row and column is an orthogonal basis of $\mathbb{C}^n$ then we get such an $a^x$ via the formula above.

Using the usual notation for basis vectors $e_1,e_2,\dots,e_n\in \mathbb{C}^n$, at $n=1$ we just get: $$a=[I_1],\,x^a=[e_1].$$ At $n=2$ there are lots of ways. Here is one. Write $e_1$ and $e_2$ in column one:

$$x^a=\begin{bmatrix}e_1 & *\\ e_2& *\end{bmatrix}.$$ Now rotate the basis by an orthogonal matrix $r_1\in O(2)$: $$x^a=\begin{bmatrix}e_1 & r_1e_1\\ e_2 & r_1e_2\end{bmatrix}$$ There are two choices of rotation $r_1\in SO(2)$ (rotation by $\pm\pi/2$) that make the rows orthogonal bases too.

Let us look at the same idea for $n=3$ with rotations $r_1,r_2\in O(3)$:

$$x^a=\begin{bmatrix} e_1 & r_1e_1 & r_2e_1 \\ e_2 & r_1e_2 & r_2 e_2 \\ e_3 & r_1e_3 & r_2 e_3 \end{bmatrix}$$ This works with $$r_1=\begin{bmatrix}0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0\end{bmatrix},$$ and $r_2=r_1^2$, to give: $$x^a=\begin{bmatrix} e_1 & e_3 & e_2 \\ e_2 & e_1 & e_3 \\ e_3 & e_2 & e_1 \end{bmatrix}$$ We shouldn't really need rotations: the rows and columns only have to be orthogonal bases, not necessarily orthonormal.
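This $n=3$ example can be checked numerically (a NumPy sketch; note that every entry of the resulting $a^x$ is a diagonal matrix, a projection onto a standard basis vector, so all entries commute):

```python
import numpy as np

def proj(x):
    """Rank-one orthogonal projection onto the line spanned by x."""
    x = np.asarray(x, dtype=float)
    return np.outer(x, x) / (x @ x)

n = 3
r1 = np.array([[0, 1, 0],
               [0, 0, 1],
               [1, 0, 0]], dtype=float)   # the cyclic rotation above
r2 = r1 @ r1
E = np.eye(n)
x = [[E[i], r1 @ E[i], r2 @ E[i]] for i in range(n)]  # rows of x^a

a = [[proj(v) for v in row] for row in x]

# Every row and every column of a sums to the identity.
for i in range(n):
    assert np.allclose(sum(a[i][j] for j in range(n)), E)
    assert np.allclose(sum(a[j][i] for j in range(n)), E)
```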

Suppose in addition that we want as much non-commutativity as possible: $$[a_{ij},a_{kl}]=0\iff i=k\text{ or }j=l.$$ One can show that this is impossible for $n=1,2,3$: any such matrix has commuting entries.

I understand that the general problem of finding all such matrices $x$ for $n\geq 4$ may in fact be quite difficult, related to harder problems and to things like Hadamard matrices.

But I am really only looking for one such example (well, one at each $n$). I am wondering whether it is possible using rotations, or whether the difficulty of the problem can be illustrated by showing that it is not possible with rotations in the following way. An existence proof would also do the trick if a construction is impossible.

I don't know so much about rotations... would it work for a rotation of $2\pi/n$ about the hyperplane $x_1+x_2+\cdots+x_n=0$? Does that even make sense?

Question: Let $n\geq 4$. Denote the map $y\mapsto \langle x,y\rangle x$ by $|x\rangle\langle x|$ ($y,x\in \mathbb{C}^n$). Does there exist a rotation $r\in O(n)$ such that for $$x=\begin{bmatrix} e_1 & re_1 & r^2e_1 & \cdots & r^{n-1}e_1 \\ e_2 & re_2 & r^2 e_2 & \cdots & r^{n-1}e_2 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ e_n & re_n & r^2e_n &\cdots & r^{n-1}e_{n} \end{bmatrix}$$ the following matrix is such that: $$a^x=\begin{bmatrix} |e_1\rangle\langle e_1| & |re_1\rangle\langle re_1| & |r^2e_1\rangle\langle r^2e_1| & \cdots & |r^{n-1}e_1\rangle\langle r^{n-1}e_1| \\ |e_2\rangle\langle e_2| & |re_2\rangle\langle re_2| & |r^2e_2\rangle\langle r^2e_2| & \cdots & |r^{n-1}e_2\rangle\langle r^{n-1}e_2| \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ |e_n\rangle\langle e_n| & |re_n\rangle\langle re_n| & |r^2e_n\rangle\langle r^2e_n| &\cdots & |r^{n-1}e_n\rangle\langle r^{n-1}e_n| \end{bmatrix}$$

  • the sum on every row is the identity (I presume that the sum on every column being the identity comes for free), AND
  • only entries on the same row or column commute.

For the latter we require that, for $i\neq k$ and $j\neq l$, the one-dimensional subspaces associated to $a_{ij}$ and $a_{kl}$ are neither orthogonal nor equal: $$\langle r^{j-1}e_i,r^{l-1}e_k\rangle\neq 0\text{ AND }\mathbb{C}r^{j-1}e_i\neq \mathbb{C}r^{l-1}e_k.$$
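For concreteness, here is a sketch of how one can test a candidate $r$ against the two conditions (Python/NumPy; the helper names `proj` and `check` are mine, not standard). The cyclic shift shows how a candidate can satisfy the first condition while failing the second, since its $a^x$ has only diagonal, hence commuting, entries:

```python
import numpy as np

def proj(x):
    """Rank-one orthogonal projection onto the line C·x."""
    x = np.asarray(x, dtype=complex)
    return np.outer(x, x.conj()) / np.vdot(x, x)

def check(r):
    """Test the two bullet conditions for x_{ij} = r^{j-1} e_i."""
    n = r.shape[0]
    E = np.eye(n, dtype=complex)
    a = [[proj(np.linalg.matrix_power(r, j) @ E[i]) for j in range(n)]
         for i in range(n)]
    # Condition 1: row and column sums are the identity.
    sums_ok = all(
        np.allclose(sum(a[i][j] for j in range(n)), E) and
        np.allclose(sum(a[j][i] for j in range(n)), E)
        for i in range(n))
    # Condition 2: entries off a shared row/column do NOT commute.
    noncomm_ok = all(
        not np.allclose(a[i][j] @ a[k][l], a[k][l] @ a[i][j])
        for i in range(n) for j in range(n)
        for k in range(n) for l in range(n)
        if i != k and j != l)
    return sums_ok, noncomm_ok

# The cyclic shift gives a magic unitary, but a fully commuting one.
shift = np.roll(np.eye(4), 1, axis=0)
print(check(shift))   # (True, False)
```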

An answer for $n=4$ would be of some interest, but really I am interested in the existence of such an $r\in O(n)$ for all $n\geq 4$ or a reason why for some $n\geq 4$ no such $r$ exists.

I guess the next question is can this be done with a unitary matrix $u\in U(n)$.

Background: Such a matrix $x$ is said to be a magic basis for $\mathbb{C}^n$. A matrix $a\in M_n(\mathcal{A})$ with entries in a unital $\mathrm{C}^*$-algebra, such that each entry is a(n orthogonal) projection and all row and column sums are equal to the unit, is said to be a magic unitary. Magic unitaries are central to the study of quantum permutation groups. Of interest therein is the universal $\mathrm{C}^*$-algebra generated by the entries of a magic unitary $u$. If there is a positive answer to the question above, a monomial of entries from such an $a\in M_n(M_n(\mathbb{C}))$ given by $x\in M_n(\mathbb{C}^n)$ with orthonormal rows and columns is given by:

$$a_{i_1j_1}a_{i_2j_2}\cdots a_{i_mj_m}=|x_{i_1j_1}\rangle\langle x_{i_1j_1}|x_{i_2j_2}\rangle\langle x_{i_2j_2}|\cdots|x_{i_mj_m}\rangle\langle x_{i_mj_m}|,$$ and this can only be zero for trivial reasons (some consecutive pair $a_{i_kj_k}a_{i_{k+1}j_{k+1}}$ lying on the same row or column).

This will allow us to conclude via the universal property a similar statement for the entries of the $u$ with entries in the universal $\mathrm{C}^*$-algebra mentioned in the preceding paragraph.

JP McCarthy
  • When you say "Necessarily the entries along a row or column are mutually orthogonal.", this necessity isn't evident for me... – Jean Marie Feb 06 '23 at 17:05
  • @JeanMarie this is a general fact about partitions of unity in unital $\mathrm{C}^*$-algebras. https://math.stackexchange.com/questions/117702/orthogonal-projections-with-sum-p-i-i-proving-that-i-ne-j-rightarrow-p – JP McCarthy Feb 06 '23 at 17:06
  • Thanks for your immediate answer. – Jean Marie Feb 06 '23 at 17:07

1 Answer


Following on from comments of David Roberson in the linked question, I have a solution to the general problem. Here I present the answer without proof. Let $w=\exp(2\pi i/n)$:

$$\xi_{ij}=\frac{1}{\sqrt{n}}\begin{bmatrix} w^{(n-1)(1-j)} \\ w^{1(i-j)} \\ w^{2(i-j)} \\ w^{3(i-j)} \\ \vdots \\ w^{(n-2)(i-j)} \\ w^{(n-1)(i-1)} \end{bmatrix}$$

This is a Fourier-type vector with a twist in the first and last components. The construction, as expected, does not work for $n<4$, and in fact does not work at $n=4$ either.
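The magic-unitary part of the claim can be checked numerically (a NumPy sketch; I read the displayed vector as having components $w^{k(i-j)}$ for $1\le k\le n-2$, with the stated twisted first and last components, and use 1-based $i,j$). This only confirms that the row and column sums are the identity; the non-commutativity pattern is not tested here:

```python
import numpy as np

def xi(i, j, n):
    """The proposed vector xi_{ij} (1-based i, j), with twisted
    first and last components."""
    w = np.exp(2j * np.pi / n)
    v = np.array([w ** (k * (i - j)) for k in range(n)])
    v[0] = w ** ((n - 1) * (1 - j))
    v[-1] = w ** ((n - 1) * (i - 1))
    return v / np.sqrt(n)

n = 5
a = [[np.outer(xi(i, j, n), xi(i, j, n).conj()) for j in range(1, n + 1)]
     for i in range(1, n + 1)]

# Rows and columns of rank-one projections sum to the identity,
# so a is a magic unitary.
for i in range(n):
    assert np.allclose(sum(a[i][j] for j in range(n)), np.eye(n))
    assert np.allclose(sum(a[j][i] for j in range(n)), np.eye(n))
```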

Below is a previous answer at $n=4$.

The matrix $x$ below is such that if we form the matrix $$[a^x_{ij}]_{i,j=1}^4=[|x_{ij}\rangle \langle x_{ij}|]_{i,j=1}^4,$$ then it is a magic unitary with rank-one projections that is maximally non-commutative as described above.

$$x:= \left[\begin{array}{cccc} \left[\begin{array}{c} 1 \\ 0 \\ 0 \\ 0 \end{array}\right] & \left[\begin{array}{c} 0 \\ \frac{1}{3} \\ -\frac{2}{3} \\ -\frac{2}{3} \end{array}\right] & \left[\begin{array}{c} 0 \\ -\frac{2}{3} \\ -\frac{2}{3} \\ \frac{1}{3} \end{array}\right] & \left[\begin{array}{c} 0 \\ -\frac{2}{3} \\ \frac{1}{3} \\ -\frac{2}{3} \end{array}\right] \\ \left[\begin{array}{c} 0 \\ 1 \\ 0 \\ 0 \end{array}\right] & \left[\begin{array}{c} \frac{1}{3} \\ 0 \\ -\frac{2}{3} \\ \frac{2}{3} \end{array}\right] & \left[\begin{array}{c} -\frac{2}{3} \\ 0 \\ \frac{1}{3} \\ \frac{2}{3} \end{array}\right] & \left[\begin{array}{c} \frac{2}{3} \\ 0 \\ \frac{2}{3} \\ \frac{1}{3} \end{array}\right] \\ \left[\begin{array}{c} 0 \\ 0 \\ 1 \\ 0 \end{array}\right] & \left[\begin{array}{c} -\frac{2}{3} \\ \frac{2}{3} \\ 0 \\ \frac{1}{3} \end{array}\right] & \left[\begin{array}{c} \frac{2}{3} \\ \frac{1}{3} \\ 0 \\ \frac{2}{3} \end{array}\right] & \left[\begin{array}{c} \frac{1}{3} \\ \frac{2}{3} \\ 0 \\ -\frac{2}{3} \end{array}\right] \\ \left[\begin{array}{c} 0 \\ 0 \\ 0 \\ 1 \end{array}\right] & \left[\begin{array}{c} \frac{2}{3} \\ \frac{2}{3} \\ \frac{1}{3} \\ 0 \end{array}\right] & \left[\begin{array}{c} \frac{1}{3} \\ -\frac{2}{3} \\ \frac{2}{3} \\ 0 \end{array}\right] & \left[\begin{array}{c} -\frac{2}{3} \\ \frac{1}{3} \\ \frac{2}{3} \\ 0 \end{array}\right] \end{array}\right]$$
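Both claimed properties of this matrix can be verified numerically; a NumPy sketch with the entries of $x$ transcribed from above:

```python
import numpy as np

f = 1 / 3
X = [  # the sixteen vector entries of x, row by row
    [[1, 0, 0, 0], [0, f, -2*f, -2*f], [0, -2*f, -2*f, f], [0, -2*f, f, -2*f]],
    [[0, 1, 0, 0], [f, 0, -2*f, 2*f], [-2*f, 0, f, 2*f], [2*f, 0, 2*f, f]],
    [[0, 0, 1, 0], [-2*f, 2*f, 0, f], [2*f, f, 0, 2*f], [f, 2*f, 0, -2*f]],
    [[0, 0, 0, 1], [2*f, 2*f, f, 0], [f, -2*f, 2*f, 0], [-2*f, f, 2*f, 0]],
]
a = [[np.outer(v, v) for v in row] for row in X]  # all entries are unit vectors

I = np.eye(4)
# Magic unitary: row and column sums are the identity.
for i in range(4):
    assert np.allclose(sum(a[i][j] for j in range(4)), I)
    assert np.allclose(sum(a[j][i] for j in range(4)), I)

# Maximal non-commutativity: entries commute iff they share a row or column.
for i in range(4):
    for j in range(4):
        for k in range(4):
            for l in range(4):
                c = a[i][j] @ a[k][l] - a[k][l] @ a[i][j]
                if i == k or j == l:
                    assert np.allclose(c, 0)
                else:
                    assert not np.allclose(c, 0)
```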

I have been trying to find an answer at $n=5$ and it is proving wicked. I have been using 6th roots of unity and 24th roots of unity but it seems a pure wallpaper effort: I can get three rows down, but then maybe $a_{1,4}a_{3,4}\neq 0$... it seems a wicked problem and I think I will give up soon.

JP McCarthy