I have two sets of points, $p_i$ and $q_i$, and I need to find an affine transformation that maps each point $p_i$ onto $q_i$. However, there is a constraint that the angles between the unit basis vectors must be preserved, i.e. the transformation can be factored into an axis-aligned scaling and a rotation. The number of points is greater than $N$ (the dimension), so the problem is overdetermined, and I use the least-squares formulation:
$$ \min_{R,S} \sum \|R S p_i - q_i\|^2 , \quad R \in \operatorname{SO}(N), S \in \operatorname{diag}(\mathbb{R}^N)$$
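To make the setup concrete, here is a minimal synthetic example (NumPy; the names `P`, `Q`, `R_true`, `S_true` are just my own): the points $p_i$ are stored as columns of `P`, and `Q` holds noisy images $q_i \approx R_{\text{true}} S_{\text{true}} p_i$.

```python
import numpy as np

rng = np.random.default_rng(0)
N, m = 3, 50                                  # dimension and number of points (m > N)

# random rotation: orthogonalize a Gaussian matrix and fix the sign so det = +1
R_true, _ = np.linalg.qr(rng.standard_normal((N, N)))
if np.linalg.det(R_true) < 0:
    R_true[:, 0] *= -1

S_true = np.diag(rng.uniform(0.5, 2.0, size=N))    # per-axis scaling factors

P = rng.standard_normal((N, m))                    # points p_i as columns
Q = R_true @ S_true @ P + 0.01 * rng.standard_normal((N, m))   # q_i ≈ R S p_i
```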
The problem is similar to How to solve an overdetermined system of point mappings via rotation and translation, except that question doesn't take scaling into account. Following the answer to that question, I came up with the following (using $R^\top R=I$ and $S^\top=S$): $$ \begin{align} \|R S p_i-q_i\|^2&=(R S p_i-q_i)^\top(R S p_i-q_i) \\ &=p_i^\top S R^\top R S p_i+q_i^\top q_i-2q_i^\top R S p_i \\ &= p_i^\top S^2 p_i+q_i^\top q_i-2q_i^\top R S p_i \end{align} $$ $$ \def\argmin{\operatorname{argmin}} \argmin_{R,S}\sum_i \|R S p_i - q_i\|^2=\argmin_{R,S}\sum_i \left(p_i^\top S^2 p_i-2q_i^\top R S p_i\right) $$ $$ \def\tr{\operatorname{tr}} \begin{align} \sum_i \left(p_i^\top S^2 p_i-2q_i^\top R S p_i\right) &=\sum_i \left(\tr p_i^\top S^2 p_i-2\tr q_i^\top R S p_i\right) \\ &=\sum_i \left(\tr p_i p_i^\top S^2 -2 \tr p_iq_i^\top R S\right) \\ &=\tr\left(\left(\sum_ip_ip_i^\top\right)S^2 + \left(-2\sum_ip_iq_i^\top\right) R S\right) \\ &=:\tr\left(C S^2+B R S\right) \end{align} $$ with $C:=\sum_i p_ip_i^\top$ and $B:=-2\sum_i p_iq_i^\top$. This reduces the problem to $N\times N$ matrices, so the number of points no longer matters. However, I'm stuck at this point and can't figure out how this expression can be minimized w.r.t. $R$ and $S$.
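Continuing the synthetic example above, the reduction can be written down directly: the data enter only through the $N\times N$ matrices $C$ and $B$ (a sketch; `P` and `Q` hold the points as columns, as before).

```python
C = P @ P.T            # C = sum_i p_i p_i^T
B = -2.0 * P @ Q.T     # B = -2 sum_i p_i q_i^T

def reduced_objective(R, S):
    """tr(C S^2 + B R S); differs from sum_i ||R S p_i - q_i||^2 only by the
    constant sum_i ||q_i||^2, so both have the same minimizers."""
    return np.trace(C @ S @ S + B @ R @ S)

# sanity check of the identity at an arbitrary (R, S), here the true one
assert np.isclose(reduced_objective(R_true, S_true) + np.sum(Q**2),
                  np.sum((R_true @ S_true @ P - Q)**2))
```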
I was able to solve this problem for 2-dimensional points by writing out the matrices $R$ and $S$ in terms of their elements and then solving the minimization problem by the method of Lagrange multipliers.
$$ R=\begin{pmatrix} r_1 & -r_2 \\ r_2 & r_1 \\ \end{pmatrix}, \quad S=\begin{pmatrix} s_x & 0 \\ 0 & s_y \end{pmatrix} $$ The problem then becomes minimizing $$ \tr\left( \begin{pmatrix} c_{11} & c_{12} \\ c_{21} & c_{22} \\ \end{pmatrix} \begin{pmatrix} s_x^2 & 0 \\ 0 & s_y^2 \\ \end{pmatrix}+ \begin{pmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \\ \end{pmatrix} \begin{pmatrix} r_1 & -r_2 \\ r_2 & r_1 \\ \end{pmatrix} \begin{pmatrix} s_x & 0 \\ 0 & s_y \\ \end{pmatrix} \right) $$ subject to the constraint $r_1^2+r_2^2=1$. Solving it gives $$ k=\frac{b_{11}^2c_{22}+b_{22}^2c_{11}-b_{12}^2c_{22}-b_{21}^2c_{11}}{b_{12}b_{11}c_{22}-b_{21}b_{22}c_{11}},\\ r_1^2=\frac{2}{k^2+4\mp k\sqrt{k^2+4}},\\ r_2^2=\frac{2}{k^2+4\pm k\sqrt{k^2+4}},\\ s_x=-\frac{b_{11}}{2c_{11}}r_1-\frac{b_{12}}{2c_{11}}r_2,\\ s_y=\frac{b_{21}}{2c_{22}}r_2-\frac{b_{22}}{2c_{22}}r_1 $$
However, I'm not sure that I haven't made any mistakes, and I don't see a way to generalize this method to more than 2 dimensions.
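To at least check the 2-D formulas numerically, I can compare them against a brute-force search (only a sketch, assuming $c_{11},c_{22}>0$): for a fixed rotation angle $\theta$ the optimal $s_x$, $s_y$ follow from setting the derivatives with respect to $s_x$ and $s_y$ to zero, which leaves a one-dimensional problem in $\theta$ that can be handled by a dense sweep.

```python
import numpy as np

def solve_2d_by_sweep(C, B, num=10000):
    """Brute-force reference for min tr(C S^2 + B R S) over R(theta) and diagonal S.
    For a fixed theta the minimizing scales are
        s_x = -(B R)_{11} / (2 c_{11}),  s_y = -(B R)_{22} / (2 c_{22}),
    so only the angle has to be searched (assumes c_{11}, c_{22} > 0)."""
    best = (np.inf, None, None)
    for theta in np.linspace(-np.pi, np.pi, num):
        r1, r2 = np.cos(theta), np.sin(theta)
        R = np.array([[r1, -r2], [r2, r1]])
        BR = B @ R
        sx = -BR[0, 0] / (2 * C[0, 0])
        sy = -BR[1, 1] / (2 * C[1, 1])
        S = np.diag([sx, sy])
        val = np.trace(C @ S @ S + B @ R @ S)
        if val < best[0]:
            best = (val, R, S)
    return best   # (objective value, R, S)
```

Evaluating the same objective at both sign branches of the closed-form solution and comparing with the sweep should at least reveal which branch, if any, is the true minimizer.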
I have also tried to find the transformation using pseudoinverses, but that way I could only obtain general affine transformations, and I failed to add the constraints because they turn out to be non-linear.
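For reference, this is the unconstrained fit I mean (a sketch, reusing `P` and `Q` from the synthetic example above): the least-squares linear map is $A = Q P^{+}$, but nothing forces its columns to be orthogonal, which is exactly the non-linear constraint that $A = RS$ would impose.

```python
# Unconstrained least-squares fit: A = Q P^+ minimizes sum_i ||A p_i - q_i||^2,
# but in general it is not of the form R S (rotation times diagonal scaling).
A_ls = Q @ np.linalg.pinv(P)

# A = R S would require A^T A = S^2 to be diagonal, i.e. the columns of A to be
# mutually orthogonal -- a non-linear condition on the entries of A.
gram = A_ls.T @ A_ls
print(gram - np.diag(np.diag(gram)))   # off-diagonal entries are non-zero in general
```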