
Let us focus on $\mathbb{R}^3$ for now. We study the set of all $3\times 3$ matrices $R$ satisfying $$ R(v\times w) = (Rv) \times (Rw) \quad \forall v,w \in \mathbb{R}^3, $$ where $\times$ is the cross product.

It is known that this condition is satisfied when $R \in \text{SO}(3)$. However, is there any matrix other than the ones in $\text{SO}(3)$ that satisfies this condition?
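
For concreteness, here is a quick numerical sanity check of the condition (a sketch using numpy; the particular rotation and the non-orthogonal matrix below are just illustrative examples, not part of the question):

```python
import numpy as np

def preserves_cross(R, trials=100, tol=1e-10):
    """Check R(v x w) == (Rv) x (Rw) on random vector pairs."""
    rng = np.random.default_rng(0)
    for _ in range(trials):
        v, w = rng.standard_normal(3), rng.standard_normal(3)
        if not np.allclose(R @ np.cross(v, w), np.cross(R @ v, R @ w), atol=tol):
            return False
    return True

# A rotation about the z-axis by 0.3 radians (an element of SO(3)).
c, s = np.cos(0.3), np.sin(0.3)
Rz = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
print(preserves_cross(Rz))                        # True
print(preserves_cross(np.diag([1.0, 2.0, 3.0])))  # False: not orthogonal
```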

Sebastiano
Mr. Egg
  • Were you planning to ask the same question about higher dimensions? I have shown $R\in O(n)$ but don't yet see how to prove $\det R=1$. – anon Nov 23 '22 at 05:17

3 Answers


Yes, there is one other: the zero map.

First, observe that the map $v \mapsto Rv$ must preserve orthogonality: given $v, w \in \Bbb{R}^3$, $$Rv \cdot Rw = 0 \iff v \cdot w = 0$$ (with one exception, the zero map, which we will meet along the way). If $v = 0$ or $w = 0$, this is trivially true, so assume $v \neq 0 \neq w$. By Lagrange's identity, $$\frac{v \times w}{\|v\|^2} \times v = w - \frac{v \cdot w}{\|v\|^2}v,$$ so applying $R$ to both sides and using the hypothesis twice gives $$\frac{Rv \times Rw}{\|v\|^2} \times Rv = Rw - \frac{v \cdot w}{\|v\|^2}Rv.$$

If $v \cdot w = 0$, then the right hand side is $Rw$, and since the left hand side is a cross product with $Rv$, we have $Rv \cdot Rw = 0$. Conversely, if $Rv \cdot Rw = 0$, then taking the dot product of both sides with $Rv$ gives $$0 = 0 - \frac{v \cdot w}{\|v\|^2}\|Rv\|^2.$$ Thus either $Rv = 0$ or $v \cdot w = 0$. If we have the latter, we are done, so let's assume we have the former.

Since $Rv = 0$, we know that $Rv \times Rx = 0$ for all $x \in \Bbb{R}^3$. This implies that $R(v \times x) = 0$ for all $x \in \Bbb{R}^3$. The image of $x \mapsto v \times x$ is all of $\{v\}^\perp$ (recall $v \neq 0$), hence $\{v\}^\perp \subseteq \operatorname{null} R$. But $v \in \operatorname{null} R$ as well, hence all of $\Bbb{R}^3$ lies in $\operatorname{null} R$. That is, $R = 0$, in which case $Rv \cdot Rw = 0$ for every $v, w$; this is exactly the zero map flagged at the start, and it is covered below by the case $\lambda = 0$.
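
As a quick sanity check of the Lagrange identity used at the start of this argument, here is a short numpy sketch (a numerical spot check only, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(1)
for _ in range(1000):
    v, w = rng.standard_normal(3), rng.standard_normal(3)
    lhs = np.cross(np.cross(v, w) / np.dot(v, v), v)   # ((v x w)/|v|^2) x v
    rhs = w - (np.dot(v, w) / np.dot(v, v)) * v
    assert np.allclose(lhs, rhs)
print("Lagrange identity holds on 1000 random pairs")
```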

Now, a linear map that sends orthogonal vectors to orthogonal vectors must be a scalar multiple of an orthogonal matrix (a standard fact). So $R = \lambda O$ for some $O \in \operatorname{O}(3)$ and $\lambda \in \Bbb{R}$. Since we are in odd dimension, we may take $O \in \operatorname{SO}(3)$: if $\det O = -1$, replace $(\lambda, O)$ by $(-\lambda, -O)$, which flips the sign of the determinant.

As you noted, $O$ will preserve the cross product. Thus, $$\lambda^2O(v \times w) = \lambda^2(Ov \times Ow) = Rv \times Rw = R(v \times w) = \lambda O(v \times w)$$ for all $v, w$. If we pick $v, w$ so that $v \times w \neq 0$, then since $O$ is invertible, $O(v \times w) \neq 0$ too. Thus, $\lambda = \lambda^2$, hence $\lambda = 0$ or $\lambda = 1$.
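
To see the scalar argument numerically: scaling a rotation by $\lambda$ satisfies the identity only for $\lambda = 0$ and $\lambda = 1$. (A numpy sketch; `cross_defect` is a hypothetical helper introduced here for illustration, not something from the answer.)

```python
import numpy as np

def cross_defect(R, rng, trials=200):
    """Largest violation of R(v x w) = Rv x Rw over random pairs."""
    worst = 0.0
    for _ in range(trials):
        v, w = rng.standard_normal(3), rng.standard_normal(3)
        worst = max(worst, np.linalg.norm(R @ np.cross(v, w) - np.cross(R @ v, R @ w)))
    return worst

c, s = np.cos(1.1), np.sin(1.1)
O = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])   # a rotation in SO(3)
rng = np.random.default_rng(2)
for lam in [0.0, 0.5, 1.0, 2.0]:
    print(lam, cross_defect(lam * O, rng))  # defect ~0 only for lam = 0 and lam = 1
```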

Therefore, there is only one map you missed: the zero map (the case $\lambda = 0$). Otherwise, the matrix must belong to $\operatorname{SO}(3)$.

Theo Bendit

$R$ is either zero or a member of $SO(3)$.

Let $\{u,v,w\}$ be an orthonormal basis of $\mathbb R^3$ with positive orientation. Then $$ \begin{aligned} Ru&=R(v\times w)=Rv\times Rw,\\ Rv&=R(w\times u)=Rw\times Ru,\\ Rw&=R(u\times v)=Ru\times Rv. \end{aligned} $$ If one of $Ru,Rv,Rw$ is zero, these equations force the other two to be zero as well, and hence $R=0$.

If all of them are nonzero, the three equations imply that $Ru,Rv$ and $Rw$ are mutually orthogonal, and therefore that $\|Ru\|=\|Rv\|\|Rw\|$, $\|Rv\|=\|Rw\|\|Ru\|$ and $\|Rw\|=\|Ru\|\|Rv\|$. Substituting the second of these into the first gives $\|Ru\|=\|Rw\|^2\|Ru\|$, so $\|Rw\|=1$, and likewise $\|Ru\|=\|Rv\|=1$. In turn, $\{Ru,Rv,Rw\}$ is an orthonormal set, so $R$ is an orthogonal matrix. But then, since $\det\pmatrix{u&v&w}=1$, $$ \begin{aligned} \det R&=\det(R)\det\pmatrix{u&v&w}\\ &=\det\pmatrix{Ru&Rv&Rw}\\ &=Ru\cdot(Rv\times Rw)\\ &=Ru\cdot Ru\\ &=u\cdot u=1. \end{aligned} $$ Therefore $R\in SO(3)$.
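
A numerical spot check of these steps for a concrete $R\in SO(3)$ (a numpy sketch; the random rotation produced via a QR decomposition is just one convenient way to get an element of $SO(3)$, and is not part of the proof):

```python
import numpy as np

# A random rotation: QR gives an orthogonal Q; flipping its sign fixes det = +1 in odd dimension.
rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R = Q if np.linalg.det(Q) > 0 else -Q

u, v, w = np.eye(3)   # a positively oriented orthonormal basis (the standard one)
assert np.allclose(R @ u, np.cross(R @ v, R @ w))
assert np.allclose(R @ v, np.cross(R @ w, R @ u))
assert np.allclose(R @ w, np.cross(R @ u, R @ v))
assert np.isclose(np.linalg.det(np.column_stack([R @ u, R @ v, R @ w])), 1.0)
print("checks passed")
```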

user1551

Now that you've gotten answers for $\mathbb{R}^3$, I want to generalize.

Definition. A cross product $X(v_1,\cdots,v_k)$ is an alternating multilinear map on a real inner product space whose value is always orthogonal to its arguments and whose magnitude equals the unsigned volume of the parallelotope generated by $v_1,\cdots,v_k$.

This last condition can be described algebraically using the Gramian determinant:

$$ \|X(v_1,\cdots,v_k)\|^2=\det(V^TV), \quad V=\begin{bmatrix} | & & | \\ v_1 & \cdots & v_k \\ | & & |\end{bmatrix} $$

Note $V^TV$ is the matrix $[v_i\cdot v_j]$ of dot products. In other words, $\|X(v_1,\cdots,v_k)\|=\|v_1\wedge\cdots\wedge v_k\|$. Note also that there is no assumption that $k$ equals the dimension of the inner product space, so $V$ is not square in general. In the case $k=2$ this is Lagrange's identity $\|u\times v\|^2=\|u\|^2\|v\|^2-(u\cdot v)^2$, which is consistent with the trigonometric relations $u\cdot v=\|u\|\|v\|\cos\theta$ and $\|u\times v\|=\|u\|\|v\|\sin\theta$ (where $\theta=\angle uv$). Polarizing this version of Lagrange's identity yields the Binet-Cauchy identity $(a\times b)\cdot(c\times d)=(a\cdot c)(b\cdot d)-(a\cdot d)(b\cdot c)$, which suggests we may be able to generalize to $X(u_1,\cdots,u_k)\cdot X(v_1,\cdots,v_k)=\det(U^TV)$ (where $U=[u_i]$ and $V=[v_i]$), but this doesn't always work outside of 3D.
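
These two identities are easy to check numerically. A numpy sketch for the case $k=2$ in $\mathbb{R}^3$ (the only place `np.cross` is defined); note that $V$ is then $3\times 2$, so it is indeed not square:

```python
import numpy as np

rng = np.random.default_rng(4)
for _ in range(1000):
    a, b, c, d = rng.standard_normal((4, 3))
    V = np.column_stack([a, b])                 # 3x2 matrix with columns a, b
    # Gramian form of Lagrange's identity: |a x b|^2 = det(V^T V)
    assert np.isclose(np.dot(np.cross(a, b), np.cross(a, b)), np.linalg.det(V.T @ V))
    # Binet-Cauchy: (a x b).(c x d) = (a.c)(b.d) - (a.d)(b.c)
    lhs = np.dot(np.cross(a, b), np.cross(c, d))
    rhs = np.dot(a, c) * np.dot(b, d) - np.dot(a, d) * np.dot(b, c)
    assert np.isclose(lhs, rhs)
print("Lagrange (Gramian form) and Binet-Cauchy verified on random vectors")
```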

Lemma. If $v_1,\cdots,v_{k-1}$ are fixed and orthonormal, then $X(v_1,\cdots,v_{k-1},v)$, viewed as a linear operator in $v$, acts as zero on $\mathrm{span}\{v_1,\cdots,v_{k-1}\}$ and restricts to a compatible complex structure on the orthogonal complement.

Recall a complex structure $J$ is a linear operator satisfying $J^2=-I$; it is compatible with a real inner product if it is orthogonal, or equivalently skew-symmetric. It is so called because it "looks" like multiplication by the scalar $i$ on a complex inner product space viewed as a real inner product space.

Proof. Define $J(v)=X(v_1,\cdots,v_{k-1},v)$. It is $0$ on $\mathrm{span}\{v_1,\cdots,v_{k-1}\}$ because $X$ is alternating, so let's restrict $J$ to the orthogonal complement. There, $\|Jv\|=\|v\|$ for every $v$ (the Gramian of $v_1,\cdots,v_{k-1},v$ is then $\operatorname{diag}(1,\cdots,1,\|v\|^2)$), so $J$ is orthogonal. And $J(v)\perp v$ for all $v$, so $J$ is skew-symmetric (polarize the condition $Jv\cdot v=0$, i.e. replace $v$ with $u_1+u_2$ and see what happens). Then $J^T=-J=J^{-1}$ implies $J^2=-I$, and the lemma is proved.
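
To make the lemma concrete in the smallest case $k=2$, $n=3$: with $e$ a unit vector, $J(v)=X(e,v)=e\times v$ should vanish on $\mathrm{span}\{e\}$ and act as a compatible complex structure on $e^\perp$. A numpy sketch under those assumptions (the choice $e=(0,0,1)$ is just an example):

```python
import numpy as np

e = np.array([0.0, 0.0, 1.0])                          # unit vector; span{e} plays the role of span{v_1,...,v_{k-1}}
J = np.array([np.cross(e, x) for x in np.eye(3)]).T    # matrix of the map v -> e x v

P = np.eye(3) - np.outer(e, e)   # orthogonal projection onto the complement of e
assert np.allclose(J @ e, 0)     # J is zero on span{e}
assert np.allclose(J @ J, -P)    # J^2 = -I on the complement (and 0 on span{e})
assert np.allclose(J.T, -J)      # J is skew-symmetric
assert np.allclose(J.T @ J, P)   # J is orthogonal on the complement
print("lemma verified for k = 2, e = (0, 0, 1)")
```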

Note that for $k=2$ the lemma effectively yields the version of Lagrange's identity used in Theo's answer. If $v_1,\cdots,v_{k-1}$ are merely linearly independent, we can apply the Gram-Schmidt process to obtain an orthonormal set $\{\hat{v}_1,\cdots,\hat{v}_{k-1}\}$ with the same span (for each $j\ge1$ subtract from $v_j$ its projections onto $\hat{v}_1,\cdots,\hat{v}_{j-1}$, then normalize); then $X(v_1,\cdots,v_{k-1},v)=\|v_1\wedge\cdots\wedge v_{k-1}\|\,X(\hat{v}_1,\cdots,\hat{v}_{k-1},v)$.

Corollary. If $v_1,\cdots,v_{k-1}$ are linearly independent and orthogonal to $v$, then the equation $X(v_1,\cdots,v_k)=v$ has a solution $v_k$, and the solution is unique if we also require $v_k$ to be orthogonal to $v_1,\cdots,v_{k-1}$.

Proof. Write $c=\|v_1\wedge\cdots\wedge v_{k-1}\|$ and $J(x)=X(v_1,\cdots,v_{k-1},x)$, so by the lemma (after rescaling) $J(J(x))=-c^2x$ for every $x$ orthogonal to $v_1,\cdots,v_{k-1}$. If $v_k$ is such a solution, applying $J$ to both sides of $J(v_k)=v$ yields $-c^2v_k=J(v)$, which forces $v_k=-X(v_1,\cdots,v_{k-1},v)/c^2$. Conversely, this vector is orthogonal to $v_1,\cdots,v_{k-1}$ (values of $X$ always are), and applying $J$ to it gives $-J(J(v))/c^2=v$, since $v$ also lies in the orthogonal complement.
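
A worked instance of the corollary for $k=2$ in $\mathbb{R}^3$, with numpy: for $v_1\perp v$, the formula from the proof, $v_2=-(v_1\times v)/\|v_1\|^2$, solves $v_1\times v_2=v$. The specific vectors below are just an example:

```python
import numpy as np

v1 = np.array([2.0, 0.0, 0.0])
v  = np.array([0.0, 3.0, 4.0])          # orthogonal to v1
# Solve v1 x v2 = v: apply x -> v1 x x to both sides and divide by -|v1|^2.
v2 = -np.cross(v1, v) / np.dot(v1, v1)
assert np.allclose(np.cross(v1, v2), v)
print(v2)   # the unique solution orthogonal to v1
```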

Definition. A symmetry or automorphism $R$ of $X$ is a linear operator satisfying $$ X(Rv_1,\cdots,Rv_k)=RX(v_1,\cdots,v_k) $$ identically. (We do not assume $R$ is orthogonal here.)

Theorem. Any automorphism of a cross product is either zero or orthogonal.

Proof. Pick any nonzero orthogonal vectors $v_1$ and $v$, extend $v_1$ to linearly independent vectors $v_1,\cdots,v_{k-1}$ orthogonal to $v$, and use the corollary to pick $v_k$ with $X(v_1,\cdots,v_k)=v$. Applying $R$ yields $X(Rv_1,\cdots,Rv_k)=Rv$, so $Rv_1\perp Rv$. Thus $R$ sends orthogonal vectors to orthogonal vectors, and by the result cited in Theo's answer this makes $R$ a scalar multiple of an orthogonal matrix, $R=\lambda O$ with $\lambda\ge0$ (replace $O$ by $-O$ if necessary). Finally, compare magnitudes in the defining identity for linearly independent $v_1,\cdots,v_k$: the left-hand side has magnitude equal to the volume of the parallelotope spanned by $\lambda Ov_1,\cdots,\lambda Ov_k$, which is $\lambda^k$ times the volume spanned by $v_1,\cdots,v_k$ (orthogonal maps preserve volume), while the right-hand side has magnitude $\lambda$ times that volume. Hence $\lambda^k=\lambda$, so $\lambda=0$ or $\lambda=1$; that is, $R$ is zero or orthogonal.

anon