6

Let $f:\mathbb{R}\to\mathbb{R}$ be a non-constant, continuous, periodic function. Prove that $f$ has a smallest (minimum) period.

The definition of period that I work with is:

$p$ is a period of a function $f$ if $p>0 \ \land \ \forall x\in\mathbb{R}: f(x+p)=f(x)$.

Don Fanucci
  • 2,515
  • It might be worth considering what a contradiction would be were there to be no smallest period - could you think of what might happen? What kind of "period" would a constant function have? – Aaron May 11 '16 at 11:25

3 Answers

10

Let $P$ be the set of periods of $f$. Using your definition, $P$ is non-empty (since $f$ is periodic) and bounded below by $0$. Consider $p^*=\inf P$. Take a sequence $p_n \in P$ with $p_n \to p^*$. Fix $x \in \mathbb R$. Then $x+p_n \to x+p^*$ and, by continuity of $f$, $f(x+p^*)= \lim f(x+p_n)=f(x)$.

If $p^*>0$, then $p^* \in P$ and so $p^* = \min P$.

If $p^*=0$, then we need to argue that $f$ is constant, contradicting the hypothesis that $f$ is non-constant.

For a more conceptual approach, here is a roadmap:

  • The set of periods of a function is an additive subgroup of $\mathbb R$.

  • An additive subgroup of $\mathbb R$ is either cyclic or dense.

  • The set of periods of a continuous function is a closed set.

  • A continuous function with a dense set of periods is constant (see the example below for why continuity matters).
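As for why continuity is essential in the last point (an example I am adding, not part of the original roadmap): the indicator function of the rationals,

$$\chi_{\mathbb Q}(x)=\begin{cases}1, & x\in\mathbb Q,\\ 0, & x\notin\mathbb Q,\end{cases}$$

is non-constant and admits every positive rational as a period, so its set of periods is dense and it has no smallest period; of course, it is nowhere continuous.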

lhf
  • 221,500
5

Outline of the proof: Assume, for contradiction, that there is no smallest period. First use the fact that the difference of two periods is also a period (see the check below) to show that you can find a decreasing sequence of periods which converges to $0$.
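To spell out that fact (a one-line check, added here for completeness): if $p>q>0$ are both periods, then for every $x$

$$f\bigl(x+(p-q)\bigr)=f\bigl((x+(p-q))+q\bigr)=f(x+p)=f(x),$$

so $p-q$ is again a period whenever it is positive.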

The existence of such a sequence means that for each $\delta >0$ you can find a period $T$ such that

$$0 < T <\delta$$

Now pick arbitrary $x,y \in \mathbb R$.

Fix $\epsilon >0$. By continuity of $f$ at $y$, there exists a $\delta>0$ such that for all $z$ $$|y-z| < \delta \Rightarrow |f(y)-f(z)|<\epsilon$$

Pick a period $T$ with $0< T < \delta$.

Show now that there exists some $n \in \mathbb Z$ such that $|(x+nT)-y|<\delta$.
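For the existence of such an $n$ (a sketch of this detail, not part of the original outline): take $n=\lfloor (y-x)/T\rfloor$, so that $nT \le y-x < (n+1)T$ and hence

$$0 \le (y-x)-nT < T < \delta,$$

which is exactly $|(x+nT)-y| < \delta$.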

Then, since $nT$ is an integer multiple of the period $T$, we have $f(x+nT)=f(x)$, and so $$|f(x)-f(y)|=|f(x+nT)-f(y)| <\epsilon$$

Since this is true for all $\epsilon>0$, we get $f(x)=f(y)$. As $x$ and $y$ are arbitrary, $f$ is constant, which is the desired contradiction.

N. S.
  • 134,609
  • But the function does not necessarily have a sequence of periods which converges to $0$. – Don Fanucci May 11 '16 at 11:59
  • @Lior Read again the first paragraph. If it doesn't have a smallest period, you can construct a decreasing convergent sequence $b_n$ of periods. Then $t_n=b_{n-1}-b_n$ is a sequence of periods which converges to $0$. – N. S. May 11 '16 at 12:03
1

Here's a rough outline: can you work out the details?

Suppose that $f$ did not have a smallest period, so that there were numbers $a_1,a_2,\dots$, positive and tending to zero, such that $f(x+a_n) = f(x)$ for all $x$. Then if $x\neq y$ were any two distinct points, we could make $x$ and $y + m\cdot a_n$ as close as we like by choosing appropriate values of $m$ and $n$. This implies, by the continuity of $f$ (why?), that $f$ must take the same value at $x$ and $y$, i.e., that $f$ is constant.
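To fill in the "(why?)" (my sketch of the detail): fix $\epsilon>0$ and take the $\delta>0$ given by continuity of $f$ at $x$. Choose $n$ with $a_n<\delta$ and set $m=\lfloor (x-y)/a_n\rfloor$, so that $0 \le (x-y)-m\cdot a_n < a_n < \delta$, i.e. $|x-(y+m\cdot a_n)|<\delta$. Since $a_n$ is a period, $f(y+m\cdot a_n)=f(y)$, and therefore

$$|f(x)-f(y)|=\bigl|f(x)-f(y+m\cdot a_n)\bigr|<\epsilon.$$

As $\epsilon$ was arbitrary, $f(x)=f(y)$.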

user134824
  • 12,694
  • 1
    That's wrong, because the periods could be bounded below by some positive, nonzero rational, and thus there wouldn't be any sequence of periods that tends to zero. @user134824 – Don Fanucci May 11 '16 at 11:38
  • 1
    You can take the infimum of the periods and argue, using continuity, that the infimum is itself a period (and hence the smallest one). This is similar to what user134824 wrote above, except that he/she has argued the case when the infimum is $0$. – Aman May 11 '16 at 11:56