
Theorem

The rotation by $\theta$ function $T:\mathbb{R}^2 \to \mathbb{R}^2$ defined for $\vec{x}=\begin{bmatrix}x_1 \\ x_2\end{bmatrix} =\begin{bmatrix}r\cos(\alpha) \\ r\sin(\alpha)\end{bmatrix}$ by \begin{align*} T(\vec{x}) = \begin{bmatrix} r\cos(\alpha+\theta) \\ r\sin(\alpha+\theta) \end{bmatrix} \end{align*} is a linear function.

Proof

  1. First, we show $T(c\vec{a}) = cT(\vec{a})$:

    \begin{align*} T(c \vec{a}) &=T\left(c \begin{bmatrix} r \cos(\alpha) \\ r \sin(\alpha) \end{bmatrix}\right) \\ &=T\left(\begin{bmatrix} cr \cos(\alpha) \\ cr \sin(\alpha) \end{bmatrix}\right) \\ &=\begin{bmatrix} cr \cos(\alpha+\theta) \\ cr \sin(\alpha+\theta) \end{bmatrix} \\ &=c\begin{bmatrix} r \cos(\alpha+\theta) \\ r \sin(\alpha+\theta) \end{bmatrix} \\ &=c T\left(\begin{bmatrix} r \cos(\alpha) \\ r \sin(\alpha) \end{bmatrix}\right) \\ &=c T(\vec{a}) \end{align*}

  2. Next, we show $T(\vec{a}+\vec{b}) = T(\vec{a})+T(\vec{b})$:

    \begin{align*} T \left(\vec{a}+\vec{b}\right) &= T\left(\begin{bmatrix} a_1 \\ a_2 \end{bmatrix} +\begin{bmatrix} b_1 \\ b_2 \end{bmatrix}\right)\\ &= T\left(\begin{bmatrix} r \cos(\alpha) \\ r \sin(\alpha) \end{bmatrix} +\begin{bmatrix} s \cos(\beta) \\ s \sin (\beta) \end{bmatrix}\right)\\ &=T\left(\begin{bmatrix} r \cos(\alpha) + s \cos(\beta) \\ r \sin(\alpha) + s \sin (\beta) \end{bmatrix}\right)\\ & \ \ \vdots \\ & \ \ \vdots \\ &=\begin{bmatrix} r \cos(\alpha + \theta) \\ r \sin (\alpha + \theta) \end{bmatrix} + \begin{bmatrix} s \cos (\beta + \theta) \\ s \sin (\beta + \theta) \end{bmatrix}\\ &=T\left(\begin{bmatrix} r \cos(\alpha) \\ r \sin (\alpha) \end{bmatrix}\right) + T\left(\begin{bmatrix} s \cos (\beta) \\ s \sin (\beta) \end{bmatrix}\right)\\ &=T(\vec{a})+T(\vec{b}) \end{align*}


I was able to prove part (1) successfully, but I am having trouble with part (2). I have checked some resources for a way to combine

\begin{align*} &r \cos(\alpha) + s \cos(\beta), \\ &r \sin(\alpha) + s \sin(\beta) \end{align*}

Could anyone help flesh out the missing steps of this proof?

Note: I know this transformation can be expressed as multiplication by the matrix $\begin{bmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta)\end{bmatrix}$, and that multiplication by a fixed matrix is always linear, but I would like to avoid this approach and instead show the missing steps directly using trig identities.

EthanAlvaree

2 Answers


Find $t$ and $\gamma$ such that $$ r\cos\alpha + s\cos\beta=t\cos\gamma\tag1 $$ and $$ r\sin\alpha + s\sin\beta = t\sin\gamma.\tag2 $$ (Why is this possible? Divide equation (2) by (1) to get a formula for $\tan \gamma$, then deduce $t$ using either equation. This requires us to exclude edge cases that cause division by zero.)
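One concrete way to see that such $t$ and $\gamma$ exist is to read them off as the polar coordinates of the sum vector: one may take $$ t=\sqrt{(r\cos\alpha+s\cos\beta)^2+(r\sin\alpha+s\sin\beta)^2} $$ and, whenever $t\neq 0$, choose $\gamma$ with $$ \cos\gamma=\frac{r\cos\alpha+s\cos\beta}{t},\qquad \sin\gamma=\frac{r\sin\alpha+s\sin\beta}{t}, $$ so that equations (1) and (2) hold by construction. (The degenerate case $t=0$, i.e. $\vec{a}+\vec{b}=\vec{0}$, has to be handled separately.)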

Then $$T\left(\begin{bmatrix} r \cos(\alpha) + s \cos(\beta) \\ r \sin(\alpha) + s \sin (\beta) \end{bmatrix}\right)=T\left(\begin{bmatrix}t\cos\gamma\\t\sin\gamma\end{bmatrix}\right)= \begin{bmatrix}t\cos(\gamma+\theta)\\t\sin(\gamma+\theta)\end{bmatrix} $$ Now expand out $t\cos(\gamma+\theta)$: $$ \begin{aligned} t\cos(\gamma+\theta)&= t\cos\gamma\cos\theta-t\sin\gamma\sin\theta\\ &=(r\cos\alpha+s\cos\beta)\cos\theta - (r\sin\alpha+s\sin\beta)\sin\theta\\ &=r(\cos\alpha\cos\theta-\sin\alpha\sin\theta) +s(\cos\beta\cos\theta - \sin\beta\sin\theta)\\ &=r\cos(\alpha+\theta) + s\cos(\beta+\theta) \end{aligned} $$ The calculation for $t\sin(\gamma+\theta)$ is similar.
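For completeness, the similar computation for $t\sin(\gamma+\theta)$, using the same substitutions: $$ \begin{aligned} t\sin(\gamma+\theta)&=t\sin\gamma\cos\theta+t\cos\gamma\sin\theta\\ &=(r\sin\alpha+s\sin\beta)\cos\theta+(r\cos\alpha+s\cos\beta)\sin\theta\\ &=r(\sin\alpha\cos\theta+\cos\alpha\sin\theta)+s(\sin\beta\cos\theta+\cos\beta\sin\theta)\\ &=r\sin(\alpha+\theta)+s\sin(\beta+\theta). \end{aligned} $$ Together, the two expansions supply the steps hidden behind the $\vdots$ in part (2) of the question.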

grand_chat
  • Thanks very much for your reply! But how would one compute $t$ and $\gamma$? Is there a formula for these constants in terms of $r$, $s$, $\alpha$, and $\beta$? Thanks again. – EthanAlvaree Mar 23 '23 at 07:18
  • The ratio of (2) to (1) gives $\tan \gamma = \frac{r\sin\alpha + s\sin\beta}{r\cos\alpha + s\cos\beta}$ so $\gamma=\arctan\left(\frac{r\sin\alpha + s\sin\beta}{r\cos\alpha + s\cos\beta}\right)$. Plug $\gamma$ into (1) to obtain $t$. In order for the proof to proceed, it is sufficient to show that $\gamma$ and $t$ exist. – grand_chat Mar 23 '23 at 14:04
  • Sorry for still not seeing it, but why is it the case that equations (1) and (2) are true? Is this a trig identity? How would one go about calculating $t$ and $\gamma$ - what's the formula for these constants? – EthanAlvaree Mar 24 '23 at 17:41
  • Equations (1) and (2) are equations that $t$ and $\gamma$ must satisfy. They are essentially defining $t$ and $\gamma$. For any given $r, s, \alpha, \beta$ it is possible to find $t$ and $\gamma$ that satisfy these conditions, by following the recipe in my first comment – grand_chat Mar 24 '23 at 17:46
  • I am trying to be convinced that the linear combination of cosines is another cosine, and that the linear combination of sines is another sine. So far I am not seeing any proof of that. – EthanAlvaree Mar 24 '23 at 19:37
  • The assertion is not that the linear combination of cosines is another cosine--there is a factor of $t$ as well. If you have calculated $t$ and $\gamma$ according to the above recipe, you can confirm that equations (1) and (2) are now true. – grand_chat Mar 24 '23 at 19:51
  • I think I understand. We are not claiming a linear combination of cosine waves is a scaled up cosine wave using a Fourier Series argument or something like that, right? We're just saying given fixed angles $\alpha$ and $\beta$, we can find another angle $\gamma$ (and an appropriate scalar $t$) to make equation (1) true. – EthanAlvaree Mar 24 '23 at 20:05
  • We would have to choose $\gamma=\arctan\left(\frac{r \sin \alpha + s \sin \beta}{r \cos \alpha + s \cos \beta}\right)$ and $t=\frac{r \cos \alpha + s \cos \beta}{\cos \gamma} = \frac{r \cos \alpha + s \cos \beta}{\cos \left(\arctan\left(\frac{r \sin \alpha + s \sin \beta}{r \cos \alpha + s \cos \beta}\right)\right)}$. Therefore $\gamma$ and $t$ are uniquely defined in terms of the constants $\alpha$, $\beta$, $r$, and $s$, is that correct? – EthanAlvaree Mar 24 '23 at 20:06
  • Correct, the values $t$ and $\gamma$ depend on the current values for $r,s,\alpha,\beta$; equations (1) and (2) are not identities in the sense that $\sin 2\theta = 2 \sin\theta\cos\theta$ is an identity. – grand_chat Mar 24 '23 at 20:07

I am aware that you've mentioned you prefer to avoid the matrix form of the rotation. Still, I want to point out that the matrix form itself can be derived from the angle-sum trigonometric identities (since you seem to prefer trigonometry).

If the vector $\vec x$ is given by \begin{align*} x_1&=r\cos\alpha\\ x_2&=r\sin\alpha \end{align*} then you have \begin{align*} y_1&=r\cos(\alpha+\theta)=r(\cos\alpha\cos\theta-\sin\alpha\sin\theta)=x_1\cos\theta-x_2\sin\theta\\ y_2&=r\sin(\alpha+\theta)=r(\cos\alpha\sin\theta+\sin\alpha\cos\theta)=x_1\sin\theta+x_2\cos\theta \end{align*} which gives you the expression in the matrix form. (And in this form, it is easy to verify linearity.)
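For reference, a minimal sketch of that verification: writing the map as $T(\vec x)=A\vec x$ with $$ A=\begin{bmatrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta\end{bmatrix}, $$ linearity follows at once from the distributivity of matrix multiplication, $$ T(c\vec a+\vec b)=A(c\vec a+\vec b)=cA\vec a+A\vec b=cT(\vec a)+T(\vec b). $$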