
With the Taylor series representation of $\sin$ or $\cos$ as a starting point (and assuming no other knowledge about those functions), how can one:

a. prove they are periodic?

b. find the value of the period?

jprete
  • The proof's somewhere in chapter 2 of Ahlfors's Complex Analysis, but I don't have my copy with me to check. – J. M. ain't a mathematician Sep 09 '11 at 13:59
  • English aside: "you don't know nothing else"... You mean "we know nothing else" or "we don't know anything else". – GEdgar Sep 09 '11 at 16:31
  • @Pierre: GEdgar is correct that language purists will sneer at your double negative. But double negatives are a regular feature of many dialects of English, so don't be downhearted. GEdgar is wrong, however, in insisting on the first person plural ('we'). The second person ('you') is preferable here, because in fact we do know more about these functions. – TonyK Sep 09 '11 at 17:36
  • @Tony: in that case, "and without assuming anything else" might be a better choice of words, no? – J. M. ain't a mathematician Sep 09 '11 at 17:57
  • @Tony, linguistic peevery about double negatives is one thing -- but in mathematics we need to place greater demands on precision than ordinary language does. This is for the pragmatic reason that we sometimes do have to utter a negated negation and have it understood as such. And, also in contrast to most nonmathematical conversation, we cannot rely on common sense to disambiguate, because we sometimes deliberately utter falsehoods for the purpose of proving things about these falsehoods. – hmakholm left over Monica Sep 09 '11 at 18:09
  • I think a proof of this is in Baby Rudin (i.e. Walter Rudin's Principles of Mathematical Analysis). Maybe in the "special functions" chapter. – Michael Hardy Sep 09 '11 at 18:32
  • @J.M.: Yes, your version is probably best. But OP's choice of pronoun was better than GEdgar's. – TonyK Sep 09 '11 at 18:34
  • This is done in the Prologue to Rudin Real and Complex Analysis. You can find a more detailed exposition of the same argument in the appendix on exp, sin and cos in Complex Made Simple. – David C. Ullrich Jan 05 '18 at 15:18

4 Answers


A rough sketch for (a) could be

  1. The power series converge everywhere.
  2. $\sin(x)^2+\cos(x)^2=1$. (As noted below, it is simpler to do this after step 3.)
  3. $\frac{d}{dx}\sin(x)=\cos(x)$ and $\frac{d}{dx}\cos(x)=-\sin(x)$.
  4. The differential equation $\frac{d^2y}{dx^2} = -y$ determines $y$ for all $x$ if you know $y$ and $\frac{dy}{dx}$ at any one particular $x$.
  5. There is a smallest positive $\theta$ such that $\cos(\theta)=0$ and $\sin(\theta)=1$.
  6. $\cos(\theta+x)=-\sin(x)$ for this particular $\theta$.
  7. $\cos$ and $\sin$ both have period $4\theta$.

For part (b), you have to determine the period numerically in general. Of course the answer is $2\pi$, but proving this depends on what your definition of $\pi$ is. A popular definition is that $\pi$ is simply twice the smallest positive $\theta$ such that $\cos(\theta)=0$, in which case period $2\pi$ is just a tautology.
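The sketch above can be checked numerically (a sanity check under stated assumptions, not a proof): the snippet below sums truncated Taylor series for $\sin$ and $\cos$, locates the smallest positive zero $\theta$ of $\cos$ by bisection, and confirms that $4\theta = 2\pi$ behaves as a period. The cutoff of 40 terms and the bisection bracket $[0, 2]$ are my own choices, not part of the answer.

```python
import math

# Partial sums of the Taylor series for cos and sin, truncated at N terms
# (N = 40 is an arbitrary cutoff, ample for the range of x used here).
def cos_series(x, N=40):
    return sum((-1)**k * x**(2*k) / math.factorial(2*k) for k in range(N))

def sin_series(x, N=40):
    return sum((-1)**k * x**(2*k+1) / math.factorial(2*k+1) for k in range(N))

# Step 5: locate the smallest positive zero of cos by bisection on [0, 2],
# where the series shows a sign change (cos 0 = 1, cos 2 < 0).
lo, hi = 0.0, 2.0
for _ in range(60):
    mid = (lo + hi) / 2
    if cos_series(mid) > 0:
        lo = mid
    else:
        hi = mid
theta = (lo + hi) / 2

print(theta)              # ≈ 1.5707963... i.e. pi/2
print(sin_series(theta))  # ≈ 1, matching step 5
# Step 7: 4*theta = 2*pi acts as a period of the series.
print(cos_series(0.5 + 4*theta) - cos_series(0.5))  # ≈ 0
```

The bracket $[0, 2]$ works because $\cos 0 = 1$ while the series shows $\cos 2 < 0$, as the answers below make precise.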

  • Proving 2. from the power series is a bit of work. – Mark Bennet Sep 09 '11 at 14:15
  • @Mark : I would suggest starting with proving 3 (straightforward), then differentiating $\cos^2+\sin^2$ to find that it is actually constant, and then evaluating at $0$. Another way (which I actually prefer) is to use the exponential to prove $|e^{it}| = 1$ for $t \in \mathbb{R}$, as Beni suggests. – Joel Cohen Sep 09 '11 at 14:28
  • @Joel: I like your first suggestion, though, because it avoids going beyond the reals. – Mark Bennet Sep 09 '11 at 15:09
  • @Joel: I suggested the same thing for a similar problem. The periodicity follows from $\sin^2(x)+\cos^2(x)=1$ (ensuring a closed path) and the flow equation $$\frac{\mathrm{d}}{\mathrm{d}x}\begin{pmatrix}\sin(x)\\ \cos(x)\end{pmatrix}=\begin{pmatrix}0&1\\-1&0\end{pmatrix}\begin{pmatrix}\sin(x)\\ \cos(x)\end{pmatrix}$$ – robjohn Nov 09 '11 at 15:56
  • Idk why this took me so long, but if anyone else is unsure how this proof is finished, you first note that by $\cos(x) \neq 0$ on $[0, \theta)$ and $\cos(0) = 1$, we know $\cos(x) > 0$ on this range (otherwise the intermediate value theorem would force $\cos(x) = 0$), so by the derivative relations we get that $\sin$ increases monotonically here, meaning $\cos$ decreases monotonically. Now that you know $\cos$ goes from 1 to 0 and $\sin$ from 0 to 1 monotonically on $[0, \theta]$, using $\cos(\theta + x) = -\sin(x)$ you can analyze the behavior on all of $[0, 4\theta]$ to find the exact period. – MathNeophyte Oct 05 '24 at 01:45
  • I'm assuming $\cos(x + \theta) = -\sin(x)$ comes from the sum of angles identity, which you prove either by showing both sides satisfy the same differential equation with the same initial conditions, or algebraically after using Euler's formula to get the exponential forms $\cos(z) = \frac{e^{iz}+e^{-iz}}{2}$ and $\sin(z) = \frac{e^{iz}- e^{-iz}}{2}$ (kinda similar to how you prove the Pythagorean identity here in either of the ways that Joel mentioned). – MathNeophyte Oct 05 '24 at 01:56
  • And the companion to $\cos(\theta + x) = -\sin(x)$ is of course $\sin(\theta + x) = \cos(x)$, which you also need. – MathNeophyte Oct 05 '24 at 01:58
  • So in the end you don't actually need step 3 or the rather algorithmic proof of the existence and uniqueness of differential equations for this problem, as long as you use the algebraic proofs of all the trig identities. Also, to argue the first positive zero exists before you know much about the signs/increasing or decreasing behavior it's probably best to use the argument Beni uses. – MathNeophyte Oct 05 '24 at 02:07

Some ideas can be found in Rudin's Real and Complex Analysis, first chapter about the exponential function.

The exponential is defined to be

$$e^t=1+t+\frac{t^2}{2!}+\cdots$$

Using the Taylor representation for $\sin, \cos$ it is easy to see that $e^{it}=\cos t+i\sin t$, so to prove that $\cos, \sin$ are periodic it is enough to prove that $t\mapsto e^{it}$ is periodic, and for this it is enough to prove that there exists $t_0>0$ with $e^{it_0}=1$. Such a $t_0$ is $2\pi$ (since $\sin 2\pi=0$ and $\cos 2\pi=1$) and we are done.


If we don't know anything about the existence of $\pi$ and the values of $\sin,\cos$ in $2\pi$, then we can proceed as follows:

$\cos 0=1$; this is obvious from the series. $\cos 2< 1-\frac{4}{2}+\frac{16}{24}=-\frac{1}{3}$ (the inequality is easy to prove). From the series we can see that $\cos$ is a continuous function and therefore there exists a smallest $t_0>0$ with $\cos t_0=0$. Define $\pi=2t_0$. $|e^{it}|=1$ for every $t$ since $e^{-it}=\overline{e^{it}}$ and $e^{a+b}=e^a\cdot e^b$.

Then $\sin t_0 \in \{-1,1\}$, and since $\sin' t =\cos t>0$ on $(0,t_0)$ we deduce that $\sin t_0=1$. Therefore $e^{i \pi/2}=i$, and this means $e^{2\pi i}=1$.
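Each claim in this argument can be spot-checked numerically by summing the exponential series for imaginary arguments (a sanity check, not a proof; the truncation at 60 terms and the sample point $t = 1.7$ are arbitrary choices of mine):

```python
import math

# Partial sum of the exponential series for a complex argument
# (N = 60 terms is a hypothetical cutoff, ample for |z| up to ~2*pi).
def exp_series(z, N=60):
    term, total = 1.0 + 0j, 0j
    for k in range(N):
        total += term
        term *= z / (k + 1)
    return total

# cos 2 < -1/3, as claimed from the first three terms of the series:
print(exp_series(2j).real)          # cos 2 ≈ -0.4161 < -1/3

# |e^{it}| = 1 for real t, since e^{-it} is the conjugate of e^{it}:
print(abs(exp_series(1.7j)))        # ≈ 1

# With t0 the first positive zero of cos and pi := 2*t0,
# e^{i*pi/2} = i, and raising to the fourth power gives e^{2*pi*i} = 1:
print(exp_series(1j * math.pi / 2)) # ≈ i
print(exp_series(2j * math.pi))     # ≈ 1
```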

Beni Bogosel
  • How do you know $\sin 2\pi=0$ starting from the power series? – Mark Bennet Sep 09 '11 at 14:14
  • @Mark Bennet: Below the line I have shown how to prove that. As you can see, $\pi$ is defined here to satisfy the necessary conditions. – Beni Bogosel Sep 09 '11 at 14:40
  • Indeed, and the proof looks nice. I think (see Hardy, cited below) it is also easy to show $\cos$ is monotone in the interval you use, and hence there is precisely one zero. – Mark Bennet Sep 09 '11 at 15:06
  • Well, if the function $\cos$ is continuous and $\cos 0=1, \cos 2<0$ there is surely one 'smallest' positive zero. – Beni Bogosel Sep 09 '11 at 15:09
  • Indeed. Your way avoids this unnecessary step - but it is also quite easy. – Mark Bennet Sep 09 '11 at 19:38
  • For anyone that isn't good at math but reads this stack exchange anyway, to prove the $\cos(2) < -1/3$ inequality, letting $a_n = 2^n/(2n)!$ note that $a_{n + 1}/a_n = 2/(2n + 2)$ and this is less than 1 for all $n > 0$ so you can pair terms in the series expansion of $\cos(2)$ past the ones the answer wrote out to get that they are all negative and thus the inequality must be true. – MathNeophyte Oct 04 '24 at 22:56
  • And the claim that there is a "smallest" $t_0$ is not actually that trivial or benign here, since the intermediate value theorem just tells you that there is a zero. You have to argue that since the set of positive zeroes of $\cos$ is bounded below (by $0$), it must have an infimum (by the completeness of $\mathbb{R}$), and then you must argue by continuity and the $\varepsilon$ formulation of the definition of the infimum that the value of $\cos$ at this inf must also be $0$, meaning it is in fact in the set (so a minimum). This could also trip a beginner up. – MathNeophyte Oct 04 '24 at 23:01
  • Also, it just occurred to me that strictly speaking, you've only shown that there's a period and it's at most $2\pi$. To show that there's no smaller period $p$, suppose there is. Then $e^{ip/4}$ is a complex fourth root of unity, so just by factoring $z^4 - 1 = 0$ we know it's in $\{\pm1, \pm i\}$, but since $p/4 < \pi/2 = t_0$ we know $\cos(p/4) > 0$, so we have $\cos(p/4) = 1$, contradicting the fact that $\cos$ is strictly less than $1$ on $(0, t_0)$. To argue this, you might need to use the differential analysis you have avoided (you can't simply show $\cos(\pi/2 - \varepsilon) < 1$ from the series). – MathNeophyte Oct 05 '24 at 03:16
  • How did you know that $e^{a}\cdot e^{b}=e^{a+b}$ from its series expansion? – Mr. W Jan 03 '25 at 11:01

GH Hardy sketches a proof as follows (Hardy, Pure Mathematics, Section 224)

  1. Use the series, which are absolutely convergent for all real $x$, to demonstrate the addition formulae (e.g. $\cos(a+b)=\cos(a)\cos(b)-\sin(a)\sin(b)$)

    He then says "The property of periodicity is a little more troublesome."

  2. Prove from the series for cosine that it changes sign just once in the interval $(0,2)$ [I think this is really the key insight in the proof]

  3. Call the zero $\pi/2$ and show that $\sin(\pi/2)=1$, $\cos(\pi)=-1$, $\sin(\pi)=0$

  4. Use the addition formulae to establish periodicity.

Hardy cites a full proof in Whittaker and Watson's Modern Analysis, Appendix A.
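Steps 1 and 2 of Hardy's sketch can be spot-checked numerically from the series alone (a sanity check, not a proof; the 30-term truncation, the sample points $a, b$, and the grid resolution are my own choices):

```python
import math

# Truncated Taylor series for cos and sin (30 terms, a hypothetical cutoff).
def cos_s(x, N=30):
    return sum((-1)**k * x**(2*k) / math.factorial(2*k) for k in range(N))

def sin_s(x, N=30):
    return sum((-1)**k * x**(2*k+1) / math.factorial(2*k+1) for k in range(N))

# Step 1: the addition formula holds for the series (spot check):
a, b = 0.7, 1.1
print(abs(cos_s(a + b) - (cos_s(a)*cos_s(b) - sin_s(a)*sin_s(b))))  # ≈ 0

# Step 2: cos changes sign exactly once on (0, 2) -- count sign changes
# of the series on a grid (a numerical check, not a proof):
grid = [i * 2.0 / 1000 for i in range(1001)]
signs = [cos_s(x) > 0 for x in grid]
changes = sum(1 for s, t in zip(signs, signs[1:]) if s != t)
print(changes)  # 1
```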

Mark Bennet

Start from the end of Beni Bogosel's first paragraph: We must find a positive $t$ such that $e^{it}=1$. Go through the usual proof that $$ \sum_{k=0}^{\infty} \frac{z^k}{k!} = \lim_{n\to\infty} \left(1+ \frac{z}{n}\right)^n$$

So $e^{it} = \displaystyle\lim_{n\to\infty} \left( 1+\frac{it}{n} \right)^n$. Since $1\leq \displaystyle\left(1+\frac{t^2}{n^2}\right)^n \leq \left(e^{t^2}\right)^{1/n}\to 1$, $|e^{it}|=1$.
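A quick numerical look at these two limits (not a proof; the sample point $t = 1.3$ and the values of $n$ are arbitrary choices of mine):

```python
import cmath

# (1 + it/n)^n should approach e^{it}, and its modulus should approach 1,
# since |1 + it/n|^n = (1 + t^2/n^2)^{n/2} is squeezed toward 1.
t = 1.3
for n in (10, 1000, 100000):
    z = (1 + 1j*t/n)**n
    print(n, abs(z))  # modulus tends to 1

# For large n, the product also approaches cos t + i sin t:
print(abs((1 + 1j*t/100000)**100000 - cmath.exp(1j*t)))  # small
```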

Define $\mathrm{si}(x), \mathrm{co}(x), \mathrm{ta}(x)$ to be the usual trigonometric functions defined by points on the unit circle. Draw a picture to convince yourself that $\mathrm{si}(x) \leq x \leq \mathrm{ta}(x)$ for small positive $x$, and $\mathrm{ta}(x) \leq x \leq \mathrm{si}(x)$ for small negative $x$. Argue via the squeeze theorem that $\mathrm{ta}(x) \sim x$ as $x\to 0$.

Now note $\arg(e^{it}) = \displaystyle\lim_{n\to\infty} n\cdot \mathrm{ta}^{-1}(t/n) = t$. Thus $e^{it}$ is the point on the unit circle whose argument is $t$, so it has period equal to the circumference of the unit circle, $2\pi$.
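The argument computation can be seen numerically as well (the sample point $t = 2.0$ is an arbitrary choice; the built-in `math.atan` plays the role of $\mathrm{ta}^{-1}$ here):

```python
import math

# arg((1 + it/n)^n) = n * arctan(t/n), which tends to t as n grows,
# since arctan(x) ~ x near 0.
t = 2.0
for n in (10, 1000, 100000):
    print(n, n * math.atan(t / n))  # tends to 2.0
```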

Note that we don't have to depart from our geometric definition of $\pi$; instead it arises naturally. As freebies, we get that $\sin x = \mathrm{si}(x)$ and $\cos x = \mathrm{co}(x)$, i.e. these power series definitions agree with the historical definition with triangles.

Ragib Zaman