Now that you've gotten answers for $\mathbb{R}^3$, I want to generalize.
Definition. A cross product $X(v_1,\cdots,v_k)$ is an alternating multilinear map on a real inner product space whose value is always orthogonal to each of its arguments and whose magnitude equals the unsigned volume of the parallelotope generated by $v_1,\cdots,v_k$.
This last condition can be described algebraically using the Gramian determinant:
$$ \|X(v_1,\cdots,v_k)\|^2=\det(V^TV), \quad V=\begin{bmatrix} | & & | \\ v_1 & \cdots & v_k \\ | & & |\end{bmatrix} $$
Note that $V^TV$ is the matrix $[v_i\cdot v_j]$ of dot products; in other words, $\|X(v_1,\cdots,v_k)\|=\|v_1\wedge\cdots\wedge v_k\|$. Also note there is no assumption that $k$ is the dimension of the inner product space, so $V$ isn't generally square.

In the case $k=2$ this is the Lagrange identity $\|u\times v\|^2=\|u\|^2\|v\|^2-(u\cdot v)^2$, which is also consistent with the trigonometric relations $u\cdot v=\|u\|\|v\|\cos\theta$ and $\|u\times v\|=\|u\|\|v\|\sin\theta$ (where $\theta=\angle uv$). Finally, polarizing this version of Lagrange's identity yields the Binet-Cauchy identity $(a\times b)\cdot(c\times d)=(a\cdot c)(b\cdot d)-(a\cdot d)(b\cdot c)$, which suggests the generalization $X(u_1,\cdots,u_k)\cdot X(v_1,\cdots,v_k)=\det(U^TV)$ (where $U=[u_i]$, $V=[v_i]$); however, this doesn't always hold outside of 3D.
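To make this concrete, here is a minimal numerical sketch (assuming NumPy; the setup is mine) checking the Gramian condition, the Lagrange identity, and Binet-Cauchy for the ordinary cross product on $\mathbb{R}^3$:

```python
import numpy as np

rng = np.random.default_rng(0)
u, v, a, b, c, d = rng.standard_normal((6, 3))

# Gramian condition: ||u x v||^2 = det(V^T V) with V = [u v]
V = np.column_stack([u, v])
assert np.isclose(np.dot(np.cross(u, v), np.cross(u, v)),
                  np.linalg.det(V.T @ V))

# Lagrange identity: ||u x v||^2 = ||u||^2 ||v||^2 - (u . v)^2
assert np.isclose(np.linalg.norm(np.cross(u, v)) ** 2,
                  np.dot(u, u) * np.dot(v, v) - np.dot(u, v) ** 2)

# Binet-Cauchy: (a x b) . (c x d) = (a . c)(b . d) - (a . d)(b . c)
assert np.isclose(np.dot(np.cross(a, b), np.cross(c, d)),
                  np.dot(a, c) * np.dot(b, d) - np.dot(a, d) * np.dot(b, c))
```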
Lemma. If $v_1,\cdots,v_{k-1}$ are fixed and orthonormal, then $X(v_1,\cdots,v_{k-1},v)$, viewed as a linear operator in $v$, acts as zero on $\mathrm{span}\{v_1,\cdots,v_{k-1}\}$ and as a compatible complex structure on the orthogonal complement.
Recall that a complex structure $J$ is a linear operator satisfying $J^2=-I$, and it is compatible with a real inner product if it is orthogonal, or equivalently skew-symmetric. It is so called because it "looks" like multiplication by the scalar $i$ on a complex inner product space viewed as a real inner product space.
Proof. Define $J(v)=X(v_1,\cdots,v_{k-1},v)$. It is $0$ on $\mathrm{span}\{v_1,\cdots,v_{k-1}\}$ because $X$ is alternating, so let's restrict $J$ to the orthogonal complement. There, $\|Jv\|=\|v\|$ for all $v$ (the Gram matrix of $v_1,\cdots,v_{k-1},v$ is the identity bordered by $\|v\|^2$, so its determinant is $\|v\|^2$), so $J$ is orthogonal. And $J(v)\perp v$ for all $v$, so $J$ is skew-symmetric (polarize the condition $Jv\cdot v=0$, i.e. replace $v$ with $u_1+u_2$ and see what happens). Then $J^T=-J=J^{-1}$ implies $J^2=-I$ and the lemma is proved.
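A quick numerical illustration of the lemma for $k=2$ in $\mathbb{R}^3$ (a sketch, assuming NumPy): with $v_1$ a unit vector, $J=v_1\times(-)$ kills $v_1$ and acts as a compatible complex structure on $v_1^\perp$.

```python
import numpy as np

rng = np.random.default_rng(1)
v1 = rng.standard_normal(3)
v1 /= np.linalg.norm(v1)             # the fixed orthonormal argument

J = lambda v: np.cross(v1, v)        # J(v) = X(v1, v)

v = rng.standard_normal(3)
w = v - np.dot(v, v1) * v1           # component of v in the complement of span{v1}

assert np.allclose(J(v1), 0)                                 # J vanishes on span{v1}
assert np.isclose(np.linalg.norm(J(w)), np.linalg.norm(w))   # orthogonal on the complement
assert np.isclose(np.dot(J(w), w), 0)                        # skew: J(w) is perpendicular to w
assert np.allclose(J(J(w)), -w)                              # J^2 = -I on the complement
```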
Note that for $k=2$ this effectively yields the version of Lagrange's identity given in Theo's answer. If $v_1,\cdots,v_{k-1}$ are any linearly independent vectors, we can apply the Gram-Schmidt process to get an orthonormal set $\{\hat{v}_1,\cdots,\hat{v}_{k-1}\}$ with the same span (for each $j\ge1$ subtract from $v_j$ its projections onto $\hat{v}_1,\cdots,\hat{v}_{j-1}$, then normalize); multilinearity and alternation then give $X(v_1,\cdots,v_{k-1},v)=\|v_1\wedge\cdots\wedge v_{k-1}\|\,X(\hat{v}_1,\cdots,\hat{v}_{k-1},v)$.
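For a case beyond $k=2$, here is a sketch (assuming NumPy; the helper `cross4` is my own name) of the $3$-fold cross product on $\mathbb{R}^4$ built from cofactor expansion, used to check the rescaling identity above against Gram-Schmidt:

```python
import numpy as np

def cross4(a, b, c):
    """3-fold cross product on R^4: the i-th component is (-1)^i times
    the 3x3 minor of the 4x3 matrix [a b c] with row i deleted."""
    M = np.column_stack([a, b, c])
    return np.array([(-1) ** i * np.linalg.det(np.delete(M, i, axis=0))
                     for i in range(4)])

rng = np.random.default_rng(2)
v1, v2, v = rng.standard_normal((3, 4))

# Gram-Schmidt on v1, v2
q1 = v1 / np.linalg.norm(v1)
u2 = v2 - np.dot(v2, q1) * q1
q2 = u2 / np.linalg.norm(u2)

# ||v1 ^ v2|| from the Gramian determinant
V = np.column_stack([v1, v2])
vol = np.sqrt(np.linalg.det(V.T @ V))

# X(v1, v2, v) = ||v1 ^ v2|| X(q1, q2, v)
assert np.allclose(cross4(v1, v2, v), vol * cross4(q1, q2, v))
```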
Corollary. If $v_1,\cdots,v_{k-1}$ are linearly independent and orthogonal to $v$, then the equation $X(v_1,\cdots,v_k)=v$ has a solution $v_k$, and it is unique if we also require $v_k\perp v_1,\cdots,v_{k-1}$ (solutions can only differ by elements of $\mathrm{span}\{v_1,\cdots,v_{k-1}\}$, which $X$ kills).
Proof. Apply $X(v_1,\cdots,v_{k-1},-)$ to both sides of the equation. By the lemma (together with the rescaling remark above), the left side is $-\|v_1\wedge\cdots\wedge v_{k-1}\|^2v_k$ when $v_k$ lies in the orthogonal complement, while the right side is $X(v_1,\cdots,v_{k-1},v)$, allowing us to solve for $v_k$; applying the same operator to this candidate shows it does solve the original equation.
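A numerical check of the corollary for $k=2$ in $\mathbb{R}^3$ (a sketch, assuming NumPy): given $v_1$ and $v\perp v_1$, the formula from the proof, $v_2=-X(v_1,v)/\|v_1\|^2$, produces the solution of $X(v_1,v_2)=v$ lying in the complement of $\mathrm{span}\{v_1\}$.

```python
import numpy as np

rng = np.random.default_rng(3)
v1 = rng.standard_normal(3)

# pick v orthogonal to v1
v = rng.standard_normal(3)
v -= np.dot(v, v1) / np.dot(v1, v1) * v1

# the formula from the proof: v2 = -X(v1, v) / ||v1||^2
v2 = -np.cross(v1, v) / np.dot(v1, v1)

assert np.allclose(np.cross(v1, v2), v)   # it solves X(v1, v2) = v
assert np.isclose(np.dot(v2, v1), 0)      # and it lies in the complement of span{v1}
```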
Definition. A symmetry or automorphism $R$ of $X$ is a linear operator satisfying $$ X(Rv_1,\cdots,Rv_k)=RX(v_1,\cdots,v_k) $$ identically. (We do not assume $R$ is orthogonal here.)
Theorem. Any automorphism of a cross product is orthogonal.
Proof. Pick any two nonzero orthogonal vectors $v_1\perp v$, then pick $v_2,\cdots,v_{k-1}$ orthogonal to $v$ with $v_1,\cdots,v_{k-1}$ linearly independent, and use the corollary to find $v_k$ with $X(v_1,\cdots,v_k)=v$. Applying $R$ yields $X(Rv_1,\cdots,Rv_k)=Rv$, so $Rv_1\perp Rv$. Thus $R$ preserves orthogonality, which, as the answer linked from Theo's answer shows, implies $R$ is orthogonal.
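As a sanity check of the theorem in $\mathbb{R}^3$ (a sketch, assuming NumPy; the random-rotation construction is mine): a rotation commutes with the cross product and is orthogonal, while a nonuniform scaling fails to be an automorphism.

```python
import numpy as np

rng = np.random.default_rng(4)

# random rotation Q in SO(3) via QR (flip a column if det(Q) = -1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1

u, v = rng.standard_normal((2, 3))

# Q is an automorphism: X(Qu, Qv) = Q X(u, v) ...
assert np.allclose(np.cross(Q @ u, Q @ v), Q @ np.cross(u, v))
# ... and, as the theorem predicts, Q is orthogonal
assert np.allclose(Q.T @ Q, np.eye(3))

# a nonuniform scaling is not an automorphism of the cross product
S = np.diag([1.0, 2.0, 3.0])
assert not np.allclose(np.cross(S @ u, S @ v), S @ np.cross(u, v))
```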