
I've been playing with the definition of the cross product and am trying to grasp the minimal algebraic assumptions needed to define the unique cross product. I remember seeing a post saying that the cross product is necessarily the only vector-valued vector multiplication satisfying the distributive properties with scalar multiplication and addition ((1)-(2) below) together with a few other simple assumptions; one such requirement would be that the product is orthogonal to both arguments ((4) below). In trying to pin down these assumptions I have arrived at the following investigation.

The cross product is defined as an operation $\times : \mathbf{R^3}\times\mathbf{R^3}\rightarrow\mathbf{R^3}$ with the following algebraic properties.

(1) $c\mathbf{v}\times\mathbf{w} = \mathbf{v}\times c\mathbf{w} = c(\mathbf{v}\times\mathbf{w})$

(2a) $(\mathbf{v} + \mathbf{u})\times \mathbf{w} = \mathbf{v}\times \mathbf{w} + \mathbf{u}\times \mathbf{w}$

(2b) $\mathbf{v} \times (\mathbf{u} + \mathbf{w}) = \mathbf{v}\times \mathbf{u} + \mathbf{v}\times \mathbf{w}$

(3) $\mathbf{v} \times \mathbf{w} = -(\mathbf{w} \times \mathbf{v})$.

With these properties along with the assumptions

(i) $\hat{i}\times \hat{j} = \hat{k}$

(ii) $\hat{j}\times \hat{k} = \hat{i}$

(iii) $\hat{k}\times \hat{i} = \hat{j}$

we can derive the definition for such a product by computing $\mathbf{v} \times \mathbf{w} = (v_1 \hat{i} + v_2 \hat{j} + v_3 \hat{k}) \times (w_1 \hat{i} + w_2 \hat{j} + w_3 \hat{k})$.
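
Carrying out this expansion, (3) forces $\hat{i}\times\hat{i} = \hat{j}\times\hat{j} = \hat{k}\times\hat{k} = \mathbf{0}$ (since $\mathbf{v}\times\mathbf{v} = -(\mathbf{v}\times\mathbf{v})$), and (3) applied to (i)-(iii) gives the reversed products, so collecting terms yields the familiar formula

$$\mathbf{v}\times\mathbf{w} = (v_2 w_3 - v_3 w_2)\,\hat{i} + (v_3 w_1 - v_1 w_3)\,\hat{j} + (v_1 w_2 - v_2 w_1)\,\hat{k}.$$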

My question is, is it possible to replace rule (3) with $\mathbf{v} \times \mathbf{w} = \mathbf{w} \times \mathbf{v}$ and assume only (i) $\hat{i}\times \hat{j} = \hat{k}$ to define a consistent multiplication? It seems like this shouldn't work but I haven't been able to find a contradiction yet.

An alternate question is: can we assume (1)-(2) with (i) along with

(4) $\mathbf{v} \cdot(\mathbf{v}\times \mathbf{w}) = \mathbf{w} \cdot(\mathbf{v}\times \mathbf{w}) = 0$

and derive (3) by contradiction?

BENG
  • I don't think so... For $\vec v\times\vec w=\vec w\times\vec v$, we need to make both products point in the same direction. An idea is that we can just "block" half of the 3D space, but that doesn't seem like a good definition. Also, since $\hat i\times\hat j=\hat k$ and $\hat j\times\hat i\neq\hat k$, isn't that a contradiction already? Perhaps we'll need to make $\hat i\times\hat j=\hat j\times\hat i=\hat k$. – TheMather - or rather AMather May 24 '23 at 03:38
  • By the way, here’s what I think is the most illuminating way to think about the cross product. Let $b, c \in \mathbb R^3$. The function $L(a) = \det \begin{bmatrix} a & b & c \end{bmatrix}$ is linear, so there exists a vector $v$ such that $L(a) = \langle a, v \rangle$ for all $a \in \mathbb R^3$. This vector $v$ is called the “cross product” of $b$ and $c$. The geometric interpretation of $v$ now follows (with some thought) from the geometric interpretations of the determinant and the dot product. – littleO May 24 '23 at 04:01
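
For anyone who wants to check littleO's determinant characterization numerically, here is a small Python/numpy sketch (the random-sampling setup is just an illustration):

```python
import numpy as np

# Check: for fixed b, c, the vector v = b x c satisfies
# det[a b c] = <a, v> for every a (a, b, c as columns).
rng = np.random.default_rng(1)
b, c = rng.standard_normal(3), rng.standard_normal(3)
v = np.cross(b, c)
for _ in range(5):
    a = rng.standard_normal(3)
    assert np.isclose(np.linalg.det(np.column_stack([a, b, c])), a @ v)
```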

1 Answer


No, if you require $v\times w=w\times v$, then even specifying (i)-(iii) alone is not enough to completely define the multiplication. So, if you specify (i) alone, then you definitely have too little.

Your properties (1),(2a),(2b) together say that $\times$ is a bilinear map $\Bbb{R}^3\times\Bbb{R}^3\to\Bbb{R}^3$. So, let us first understand linear and bilinear maps more carefully. Say you have vector spaces $X,Y,Z$ over the same field $\Bbb{F}$, with $X$ and $Y$ having finite dimension, say $n,m$ respectively (in the above example, $\Bbb{F}=\Bbb{R}$ is the field of real numbers, $n=m=3$, and $X=Y=Z=\Bbb{R}^3$). Now, in order to completely specify a bilinear mapping $T:X\times Y\to Z$, it suffices to fix a basis $\alpha=\{v_1,\dots, v_n\}$ for $X$ and a basis $\beta=\{w_1,\dots, w_m\}$ for $Y$, and to specify $T(v_i,w_j)$ for all $i,j$ (see here for the proof of this claim in the case of linear maps; I leave it to you to prove the analogue for bilinear, and more generally, multilinear maps).

So, in order to fully specify a bilinear map $T:X\times Y\to Z$, you have to specify a total of $nm$ pieces of information (i.e the values $T(v_i,w_j)\in Z$ for all $i,j$).
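
To make this concrete, here is a minimal Python/numpy sketch (the `bilinear` helper and the `einsum` encoding are my own illustration, not part of the answer): the map is stored as the table of basis values $T(v_i,w_j)$, and bilinearity dictates its value everywhere else.

```python
import numpy as np

# A bilinear map T: R^n x R^m -> R^k is pinned down by the n*m basis
# values T(v_i, w_j), stored here as an array of shape (n, m, k).
def bilinear(table, x, y):
    # Bilinearity forces T(x, y) = sum_{i,j} x_i * y_j * T(v_i, w_j).
    return np.einsum('i,j,ijk->k', x, y, table)

# Sanity check: linearity in the first slot, for a random table (n = m = k = 3).
rng = np.random.default_rng(0)
table = rng.standard_normal((3, 3, 3))
x, x2, y = rng.standard_normal(3), rng.standard_normal(3), rng.standard_normal(3)
assert np.allclose(bilinear(table, x + 2 * x2, y),
                   bilinear(table, x, y) + 2 * bilinear(table, x2, y))
```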

Now, let us come to the special case where $X=Y$, so we’re interested in bilinear maps $T:X\times X\to Z$. Then of course one has to specify a total of $n^2$ pieces of information. There are two special cases of interest:

  • anti-symmetric $T$ (assuming further that $\Bbb{F}$ doesn't have characteristic $2$, which is certainly the case for $\Bbb{R}$), i.e. a bilinear $T$ such that for all $x,y\in X$, $T(x,y)=-T(y,x)$. In this case, you only have to specify $\frac{n^2-n}{2}=\frac{n(n-1)}{2}$ pieces of information, namely $T(v_i,v_j)$ for all $i,j\in\{1,\dots, n\}$ such that $i<j$ (if you imagine storing the values $T(v_i,v_j)$ in an $n\times n$ matrix, then you have to specify the values on the strict upper triangle; from here, the entries on the main diagonal must be zero due to anti-symmetry, and those on the strict lower triangle are the additive inverses of those on the upper triangle). In the case of the cross product, $n=3$, so $\frac{n(n-1)}{2}=3$, which is why you only needed to tell me what $i\times j, i\times k, j\times k$ are. From here I can figure out the rest (the 6 other basis products) based on anti-symmetry; see the sketch after this list.
  • symmetric $T$: here, you need to specify a total of $\frac{n^2-n}{2}+n=\frac{n(n+1)}{2}$ pieces of information, i.e. $T(v_i,v_j)$ for all $i,j\in\{1,\dots, n\}$ such that $i\leq j$ (notice the difference the $<$ vs $\leq$ makes in how much you need to specify). In matrix language, you have to specify the values on the strict upper triangle and the main diagonal; then those on the strict lower triangle are known. So, in the case of $n=3$, you need to specify $\frac{n(n+1)}{2}=6$ pieces of information. However, properties (i)-(iii) only give us 3 of them, so this is insufficient to pin down a unique "cross product" (there are infinitely many symmetric bilinear maps which satisfy (i)-(iii)).
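
Here is a short numpy sketch of both cases (the array encoding is my own illustration): anti-symmetry plus (i)-(iii) pins down exactly the usual cross product, while in the symmetric case the diagonal values $T(\hat i,\hat i), T(\hat j,\hat j), T(\hat k,\hat k)$ remain free, so (i)-(iii) admit infinitely many symmetric bilinear maps.

```python
import numpy as np

def bilinear(table, x, y):
    # Evaluate the bilinear map with basis values table[i, j] at (x, y).
    return np.einsum('i,j,ijk->k', x, y, table)

e = np.eye(3)  # e[0] = i-hat, e[1] = j-hat, e[2] = k-hat

# Anti-symmetric case: (i)-(iii) are 3 pieces of data; anti-symmetry
# fills in the diagonal (zero) and the remaining 6 basis products.
A = np.zeros((3, 3, 3))
A[0, 1], A[1, 2], A[2, 0] = e[2], e[0], e[1]  # (i), (ii), (iii)
A = A - A.transpose(1, 0, 2)                  # enforce T(y, x) = -T(x, y)

v, w = np.array([1.0, 2.0, 3.0]), np.array([-4.0, 0.0, 5.0])
assert np.allclose(bilinear(A, v, w), np.cross(v, w))  # recovers the cross product

# Symmetric case: (i)-(iii) fix the off-diagonal basis products, but the
# diagonal values T(e_a, e_a) are unconstrained -- two different symmetric
# bilinear maps that both satisfy (i)-(iii):
S1, S2 = np.zeros((3, 3, 3)), np.zeros((3, 3, 3))
for S, diag in ((S1, 0.0), (S2, 1.0)):
    S[0, 1] = S[1, 0] = e[2]   # i x j = j x i = k
    S[1, 2] = S[2, 1] = e[0]   # j x k = k x j = i
    S[2, 0] = S[0, 2] = e[1]   # k x i = i x k = j
    for a in range(3):
        S[a, a] = diag * e[a]  # an arbitrary choice for T(e_a, e_a)
assert not np.allclose(bilinear(S1, v, v), bilinear(S2, v, v))
```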
peek-a-boo
    In short, there is a unique choice once you also specify $i \times i$, $j \times j$ and $k \times k$, which are forced to be $0$ for antisymmetric $T$ but are arbitrary for symmetric $T$. – ronno May 24 '23 at 12:22