
I found the recipe below to work for generating random rotation matrices in n dimensions (a straightforward modification of the CUE sampling recipe on page 11 of https://arxiv.org/abs/math-ph/0609050). Is there a similarly efficient algorithm to generate small rotations? (i.e., I want the vectors before and after the transformation to be close in some sense.)

(* Haar-uniform random rotation via QR decomposition of a Gaussian matrix,
   adapted from the CUE sampling recipe in arXiv:math-ph/0609050 *)
randomRotation[n_]:=Module[{z,q,r,d,ph,M,indices},
z=RandomVariate[NormalDistribution[0,1],{n,n}];
{q,r}=QRDecomposition[z];
d=Diagonal[r];
ph=d/Abs[d]; (* signs of the diagonal of r *)
M=q*ph; (* rescale each row of q by the corresponding sign *)
(* Note: the determinant may be -1, giving a "pseudo-rotation"; swap two rows to guarantee a true rotation *)
indices=If[Det[M]>0,Range[n],{2,1}~Join~Range[3,n]];
M[[indices]]
];
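
As a quick sanity check (for example), the output should be orthogonal with determinant +1:

R=randomRotation[5];
Chop[R.Transpose[R]-IdentityMatrix[5]] (* should be the zero matrix *)
Det[R] (* should be 1. *)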
  • Word of warning: this algorithm, due to Stewart, may generate "pseudo-rotation matrices" with determinant -1, i.e. it samples from the group O(n), not SO(n). You need to project to SO(n). See https://mathoverflow.net/a/338157 – András Aszódi Sep 14 '20 at 15:35
  • @LaryxDecidua Hm, I've been using this function to rotate my datasets, and the distribution of a dataset under a random "pseudo-rotation" looks indistinguishable from one under a "random rotation"; see the last diagram of https://www.wolframcloud.com/obj/yaroslavvb/exp/random-rotations.nb . What is the interpretation of det=-1 here? – Yaroslav Bulatov Sep 14 '20 at 15:51
  • det=-1 means that the matrix does not only rotate but also reflects (it is an improper rotation). Elements of SO(n), with det=+1, only rotate. See e.g. https://en.wikipedia.org/wiki/Orthogonal_group#Reflections – András Aszódi Sep 14 '20 at 15:59
  • thanks for the note, updated the code to give true rotations – Yaroslav Bulatov Sep 14 '20 at 16:03

2 Answers


You obtain Haar-uniform random rotations if you perform a QR decomposition of an initial matrix whose elements are independent, identically distributed Gaussian random variables (together with the sign correction on the diagonal of R, as in the code in the question).

If you want random small rotations from QR, you can apply it to initial matrices that are small random perturbations of the identity.
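
For example, a minimal sketch reusing the QR-and-sign-correction steps from the question's code (the function name smallRotationQR and the noise scale eps are mine):

(* small random rotation: QR decomposition of a slightly perturbed identity;
   eps controls how far the result is from the identity matrix *)
smallRotationQR[n_,eps_]:=Module[{z,q,r,d,ph,M,indices},
z=IdentityMatrix[n]+RandomVariate[NormalDistribution[0,eps],{n,n}];
{q,r}=QRDecomposition[z];
d=Diagonal[r];
ph=d/Abs[d];
M=q*ph;
indices=If[Det[M]>0,Range[n],{2,1}~Join~Range[3,n]];
M[[indices]]
];

For small eps, smallRotationQR[5, 0.01] is close to IdentityMatrix[5]; shrink eps for smaller rotations.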

Marcel

If by "nD rotation" you mean an orthogonal transformation with unit determinant, here is a simple recipe with low computational cost.

a) Principle: obtain it as a composition of two hyperplane reflections (a.k.a. symmetries) with rather close normal vectors (see Remark 1).

b) Matrix version: You probably know that reflection with respect to a hyperplane with unit normal (column) vector $N$ is represented by the Householder matrix $I_n-2NN^T$ (https://en.wikipedia.org/wiki/Householder_transformation). Thus the rotation matrix is $$(I_n-2N_1N_1^T)(I_n-2N_2N_2^T)$$ where, for example, $N_2$ is a small perturbation of $N_1$.
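
A minimal Mathematica sketch of this construction (the helper names householder and smallRotationHH and the perturbation scale delta are mine, not part of the answer):

(* Householder reflection in the hyperplane with unit normal nvec *)
householder[nvec_]:=IdentityMatrix[Length[nvec]]-2 Outer[Times,nvec,nvec];

(* small rotation: product of two reflections with nearly parallel unit normals *)
smallRotationHH[n_,delta_]:=Module[{n1,n2},
n1=Normalize[RandomVariate[NormalDistribution[0,1],n]];
n2=Normalize[n1+RandomVariate[NormalDistribution[0,delta],n]];
householder[n1].householder[n2]
];

The result has determinant +1 and, for small delta, is close to the identity; as noted in Remark 1 below, it fixes an $(n-2)$-dimensional subspace.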

Remark 1: As remarked by @Rahul, this rotation leaves an $(n-2)$-dimensional subspace invariant. In order to obviate this lack of generality, it suffices to apply further transformations $$(I_n-2N_{2k+1}N_{2k+1}^T)(I_n-2N_{2k+2}N_{2k+2}^T)$$ for $k=1,\dots,[n/2]$, as sketched below.
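
A sketch of this repetition, reusing smallRotationHH from the previous sketch (each pair of reflections uses fresh random normals):

(* compose Floor[n/2] independent two-reflection rotations so that no
   (n-2)-dimensional subspace is left pointwise fixed *)
smallRotationGeneral[n_,delta_]:=Dot@@Table[smallRotationHH[n,delta],{Floor[n/2]}];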

Remark 2: Principle a) generalizes a well-known property of planar geometry (the composition of two reflections is a rotation): the composition of the reflection with respect to the line $y=\tan(\alpha)x$ with the reflection with respect to the line $y=\tan(\beta)x$ is a rotation by angle $2(\beta-\alpha)$.
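
For example, this can be checked with the built-in ReflectionMatrix (reflection in the mirror normal to the given vector); the line y = Tan[t] x has unit normal {-Sin[t], Cos[t]}:

Simplify[ReflectionMatrix[{-Sin[b],Cos[b]}].ReflectionMatrix[{-Sin[a],Cos[a]}]-RotationMatrix[2 (b-a)]] (* the zero matrix *)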

Remark 3: In 3D, one can also consider Rodrigues' formula (https://en.wikipedia.org/wiki/Rodrigues%27_rotation_formula).
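
A small sketch of Rodrigues' formula for a small-angle 3D rotation, checked against the built-in RotationMatrix (the helper names crossMatrix and rodrigues are mine):

(* Rodrigues' formula: R = I + Sin[t] K + (1-Cos[t]) K.K, with K the cross-product matrix of the unit axis *)
crossMatrix[{x_,y_,z_}]:={{0,-z,y},{z,0,-x},{-y,x,0}};
rodrigues[axis_,t_]:=With[{k=Normalize[axis]},
IdentityMatrix[3]+Sin[t] crossMatrix[k]+(1-Cos[t]) crossMatrix[k].crossMatrix[k]
];

Chop[rodrigues[{1.,2.,3.},0.01]-RotationMatrix[0.01,{1.,2.,3.}]] (* the zero matrix *)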

Remark 4: The vector $N$ is an eigenvector of the Householder matrix $I_n-2NN^T$, with eigenvalue $-1$.

Jean Marie
  • This will only generate rotations that leave an $(n-2)$-dimensional subspace unchanged, namely the intersection of the two chosen hyperplanes. –  Apr 06 '17 at 23:00
  • Good remark. I should have said that, in order to obtain the most general determinant-1 isometry, one has to repeat the process (multiply again by $(I_n-2N_3N_3^T)(I_n-2N_4N_4^T)$, etc.). The advantage of this method remains its low computational cost. – Jean Marie Apr 07 '17 at 07:20
  • Besides, @Rahul, keep your downvoting to real problems. I consider that I have brought a rather precise answer to the OP. – Jean Marie Apr 08 '17 at 07:24
  • I consider this comparable to if the question had asked for generating a random vector and you provided an answer that only generated 1-sparse vectors. But given the updated answer, I have removed my downvote. –  Apr 08 '17 at 15:13
  • @Rahul I appreciate your fairness. – Jean Marie Apr 08 '17 at 19:58
  • @JeanMarie thanks, that's useful info. For random matrices the QR approach seemed easiest given existing code, but I'm reusing your approach for generating fixed small rotations for testing – Yaroslav Bulatov Apr 09 '17 at 01:11