5

This is actually Problem $ 17 $ from Chapter $ 10 $ of the Fourth Edition of Michael Spivak’s Calculus. The statement is quite simple, but I have not had any success in finding an example. Here is the statement:

Problem. Give an example of functions $ f: \Bbb{R} \to \Bbb{R} $ and $ g: \Bbb{R} \to \Bbb{R} $ such that $ g $ takes on all values (i.e., is surjective), and $ f \circ g $ and $ g $ are differentiable, but $ f $ is not differentiable.

Note: I want to assume that $ f $ is nowhere differentiable; otherwise, the problem is quite easy.

Note that this is in the second chapter on limits, which means that simple examples are expected. No functions defined in terms of integrals or power series should be necessary!

  • Is $f$ supposed to be nowhere differentiable? – Brian Tung May 08 '15 at 06:30
  • This was not clear to me either. I have copied the exercise as it appears in the book, and I tried to assume it was nowhere differentiable. – Orest Xherija May 08 '15 at 22:37
  • May we assume that the audience is familiar/comfortable with things like the Weierstrass function? – Brian Tung May 08 '15 at 22:52
  • Probably not. That was the first thing that came to my mind but this problem is way before power series are introduced. I am starting to wonder whether this is a badly worded problem... – Orest Xherija May 09 '15 at 00:44
  • It sort of has to be, if Weierstrass is out of bounds. How would they know what nowhere differentiable means? – Brian Tung May 09 '15 at 01:29
  • Isn't the Dirichlet function the canonical example of a nowhere continuous, nowhere differentiable function? http://mathworld.wolfram.com/DirichletFunction.html Considering it blows the mind of many college senior level students, we may need to simplify it. Hmmmm, difficult to use all of R as the domain and make the composition still differentiable. – nickalh May 09 '15 at 02:17
  • Can we use Q or the irrationals as the domain of f instead of R? – nickalh May 09 '15 at 02:22
  • @BrianTung: This is a badly worded problem. All that Spivak wants is for $ f $ to be not differentiable at some point, not all points. I suspect that it’s an extremely hard problem to determine whether or not an $ f $ exists that isn’t differentiable anywhere — a problem that belongs to the realm of advanced real analysis and point-set topology, not calculus. Spivak’s book has been called crazy by some, but it can’t be that crazy! – Berrick Caleb Fillmore May 10 '15 at 18:07
  • @Berrick: Well, I'm not the OP, but yes, I would agree with your assessment. (Not having the book, admittedly.) – Brian Tung May 10 '15 at 20:40

2 Answers

10

The following proposition provides only a partial response to the latest edit.

Proposition. There does not exist an $ f: \Bbb{R} \to \Bbb{R} $ that fits the stronger requirement reflected in the OP’s latest edit (namely, that $ f $ be nowhere differentiable) if we assume $ g $ to be continuously differentiable.

Proof

As $ g $ is, by assumption, surjective and differentiable everywhere, it is not constant, so by the Mean Value Theorem there exists an $ a \in \Bbb{R} $ such that $ g'(a) \neq 0 $. Then by the further assumption that $ g': \Bbb{R} \to \Bbb{R} $ is continuous, the Inverse Function Theorem says that we can find an open interval $ I $ containing $ a $ satisfying the following:

  • $ g[I] $ is an open interval containing $ g(a) $.
  • $ g|_{I}: I \to g[I] $ is invertible.
  • $ (g|_{I})^{-1}: g[I] \to I $ is differentiable.

We can now write $ f|_{g[I]} $ as the composition of two differentiable functions: $$ f|_{g[I]} = (f \circ g) \circ (g|_{I})^{-1}. $$ By the Chain Rule, $ f|_{g[I]} $ is differentiable on $ g[I] $, so we conclude that $ f $ is differentiable on some open interval at least (if $ g' $ is continuous). $ \quad \blacksquare $
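To make the recovery step concrete, here is a minimal numerical sketch. The particular functions $ g(x) = x^{3} + x $ and $ f(x) = \sin x $ are my own smooth choices, used purely for illustration: on $ g[I] $, the derivative of $ f $ can be computed from $ f \circ g $ and $ (g|_{I})^{-1} $ alone, via $ f'(y) = \frac{(f \circ g)'(g^{-1}(y))}{g'(g^{-1}(y))} $.

    import numpy as np

    # Assumed, purely illustrative choices: g is strictly increasing with a
    # continuous, non-vanishing derivative, and f is smooth so that the
    # recovered derivative can be checked against a known value.
    g   = lambda x: x**3 + x             # g'(x) = 3x^2 + 1 > 0 everywhere
    dg  = lambda x: 3.0 * x**2 + 1.0
    f   = np.sin
    fog = lambda x: f(g(x))              # the proof only differentiates f∘g and g

    def g_inverse(y, lo=-10.0, hi=10.0, tol=1e-12):
        """Invert g by bisection (g is strictly increasing on [lo, hi])."""
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if g(mid) < y else (lo, mid)
        return 0.5 * (lo + hi)

    def ddx(func, x, h=1e-6):
        """Symmetric difference quotient."""
        return (func(x + h) - func(x - h)) / (2.0 * h)

    y = 2.5                              # a point of g[I]
    a = g_inverse(y)                     # a = (g|_I)^{-1}(y)
    recovered = ddx(fog, a) / dg(a)      # f'(y) via f|_{g[I]} = (f∘g) ∘ (g|_I)^{-1}
    print(recovered, np.cos(y))          # both approximately f'(y) = cos(y)

Here $ g' > 0 $ everywhere, so the local inverse is in fact global; in the proposition one only needs it on the interval $ I $ supplied by the Inverse Function Theorem.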


Latest Edit

This latest edit, although it does not answer the question in its entirety, shows that $ f $ cannot be badly behaved everywhere.

Theorem. Let $ f: \Bbb{R} \to \Bbb{R} $ and $ g: \Bbb{R} \to \Bbb{R} $ be surjective functions such that both $ g $ and $ f \circ g $ are differentiable everywhere. Then there are uncountably many $ a \in \Bbb{R} $ such that $ f $ possesses a one-sided derivative at $ g(a) $.

Proof

As $ g $ is not constant on $ \Bbb{R} $, there exist by Darboux’s Theorem uncountably many $ a \in \Bbb{R} $ such that $ g'(a) \neq 0 $. Fix such an $ a $, and choose a $ \Delta > 0 $ such that $$ \forall h \in [- \Delta,\Delta] \setminus \{ 0 \}: \quad \frac{g(a + h) - g(a)}{h} \neq 0. $$ Such a $ \Delta $ exists because these quotients converge to $ g'(a) \neq 0 $ as $ h \to 0 $. In particular, $ g(a + h) - g(a) \neq 0 $ for all $ h \in [- \Delta,\Delta] \setminus \{ 0 \} $. As there is no danger of dividing by $ 0 $, we thus obtain \begin{align} \forall h \in [- \Delta,\Delta] \setminus \{ 0 \}: \qquad ~ & \frac{(f \circ g)(a + h) - (f \circ g)(a)}{g(a + h) - g(a)} \cdot \frac{g(a + h) - g(a)}{h} \\ = ~ & \frac{(f \circ g)(a + h) - (f \circ g)(a)}{h}. \end{align} Equivalently, \begin{align} (\spadesuit) \qquad \forall h \in [- \Delta,\Delta] \setminus \{ 0 \}: \qquad ~ & \frac{f(g(a + h)) - f(g(a))}{g(a + h) - g(a)} \\ = ~ & \frac{(f \circ g)(a + h) - (f \circ g)(a)}{h} \cdot \frac{1}{\left[ \frac{g(a + h) - g(a)}{h} \right]}. \end{align}

Define a function $ I: (0,\Delta] \to \mathcal{P}(\Bbb{R}) $ by $$ \forall \delta \in (0,\Delta]: \quad I(\delta) \stackrel{\text{df}}{=} \{ g(a + h) \in \Bbb{R} \mid h \in [- \delta,\delta] \}. $$ For each $ \delta \in (0,\Delta] $, the continuity of $ g $ (via the Intermediate and Extreme Value Theorems) guarantees that $ I(\delta) $ is a closed bounded interval containing $ g(a) $, and as $ g(a + h) \neq g(a) $ for any $ h \in [- \delta,\delta] \setminus \{ 0 \} $, we see that $ I(\delta) $ is also non-degenerate, i.e., it contains points other than $ g(a) $. Next, define \begin{align} L & \stackrel{\text{df}}{=} \{ \delta \in (0,\Delta] \mid I(\delta) \cap (- \infty,g(a)) \neq \varnothing \}, \\ R & \stackrel{\text{df}}{=} \{ \delta \in (0,\Delta] \mid I(\delta) \cap (g(a),\infty) \neq \varnothing \}. \end{align} By the foregoing discussion, we have $ L \cup R = (0,\Delta] $. Hence, either

  1. $ 0 $ is a limit point of $ L $, or
  2. $ 0 $ is a limit point of $ R $.

Without any loss of generality, we may assume it is Case (1) that occurs.

Note: The cases are not mutually exclusive, i.e., both could occur.

By our assumption that Case (1) occurs, we have $ I(\Delta) \cap (- \infty,g(a)) \neq \varnothing $. As such, let $ (y_{n})_{n \in \Bbb{N}} $ be any sequence in $ I(\Delta) \cap (- \infty,g(a)) $ that converges to $ g(a) $. We boldly claim that $$ \lim_{n \to \infty} \frac{f(y_{n}) - f(g(a))}{y_{n} - g(a)} = \frac{(f \circ g)'(a)}{g'(a)}, $$ which would imply that the left-derivative of $ f $ at $ g(a) $ exists.

Define a sequence $ (h_{n})_{n \in \Bbb{N}} $ in $ [- \Delta,\Delta] $ by $$ \forall n \in \Bbb{N}: \quad h_{n} \stackrel{\text{df}}{=} \text{A number $ h \in [- \Delta,\Delta] $ closest to $ 0 $ such that $ g(a + h) = y_{n} $}. $$ Such an $ h $ exists because $ {g^{\leftarrow}}[\{ y_{n} \}] \cap [a - \Delta,a + \Delta] $ is a non-empty compact subset of $ \Bbb{R} $. So as to avoid using the Axiom of Choice at this stage, we choose $ h_{n} $ to be positive whenever possible.

We argue that $ \displaystyle \lim_{n \to \infty} h_{n} = 0 $. Assume the contrary. Then we can find an $ \epsilon > 0 $ and a subsequence $ (h_{n_{k}})_{k \in \Bbb{N}} $ of $ (h_{n})_{n \in \Bbb{N}} $ such that $ |h_{n_{k}}| \geq \epsilon $ for all $ k \in \Bbb{N} $. Choose a $ \delta \in (0,\epsilon) \cap L $ (this is where the assumption of Case (1) plays a role). As $ I(\delta) \cap (- \infty,g(a)) = [m,g(a)) $ for some $ m < g(a) $, we are able to find a $ K \in \Bbb{N} $ sufficiently large so that $ y_{n_{K}} \in I(\delta) $. It follows that $ y_{n_{K}} = g(a + h) $ for some $ h \in [- \delta,\delta] \subseteq (- \epsilon,\epsilon) $, which makes $ h $ even closer to $ 0 $ than $ h_{n_{K}} $ is — a contradiction.

Using $ (\spadesuit) $ now, we therefore get \begin{align} \lim_{n \to \infty} \frac{f(y_{n}) - f(g(a))}{y_{n} - g(a)} & = \lim_{n \to \infty} \frac{f(g(a + h_{n})) - f(g(a))}{g(a + h_{n}) - g(a)} \\ & = \lim_{n \to \infty} \frac{(f \circ g)(a + h_{n}) - (f \circ g)(a)}{h_{n}} \cdot \frac{1}{\left[ \frac{g(a + h_{n}) - g(a)}{h_{n}} \right]} \\ & = (f \circ g)'(a) \cdot \frac{1}{g'(a)} \qquad \left( \text{As $ \lim_{n \to \infty} h_{n} = 0 $.} \right) \\ & = \frac{(f \circ g)'(a)}{g'(a)}. \end{align} Therefore, $ f $ has a left-derivative at $ g(a) $. If we were to assume Case (2) instead, it would have a right-derivative at $ g(a) $. In any case, $ f $ has a one-sided derivative at $ g(a) $, and as we have shown in the beginning that there are uncountably many such $ a $, we are done. $ \quad \blacksquare $
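The following minimal numerical sketch illustrates the claim above, with assumed, purely illustrative choices: $ g(x) = x + 2 \sin x $ (surjective and differentiable but not monotonic), $ a = 0 $ (so $ g(a) = 0 $ and $ g'(a) = 3 \neq 0 $), and a smooth $ f = \exp $, used only so that the limit can be checked against the known value $ f'(g(a)) = 1 $. For each $ y_{n} \to g(a)^{-} $ it computes the preimage $ h_{n} $ closest to $ 0 $ and checks $ (\spadesuit) $.

    import numpy as np

    # Assumed, purely illustrative choices; in the theorem f need not be
    # differentiable anywhere away from g(a).
    g   = lambda x: x + 2.0 * np.sin(x)   # surjective, differentiable, not monotonic
    dg  = lambda x: 1.0 + 2.0 * np.cos(x)
    f   = np.exp
    fog = lambda x: f(g(x))
    a   = 0.0                             # g(a) = 0, g'(a) = 3

    def closest_preimage(y, lo=-1.0, hi=0.0, tol=1e-13):
        """The h closest to 0 with g(a + h) = y; for y in (-1, 0) it is unique
        and lies in [-1, 0], where g is strictly increasing, so bisection finds it."""
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if g(a + mid) < y else (lo, mid)
        return 0.5 * (lo + hi)

    def ddx(func, x, h=1e-6):
        return (func(x + h) - func(x - h)) / (2.0 * h)

    target = ddx(fog, a) / dg(a)          # (f∘g)'(a) / g'(a)
    for n in (10, 100, 1000):
        y_n = g(a) - 1.0 / n              # y_n -> g(a) from the left
        h_n = closest_preimage(y_n)       # g(a + h_n) = y_n
        lhs = (f(y_n) - f(g(a))) / (y_n - g(a))          # difference quotient of f
        rhs = ((fog(a + h_n) - fog(a)) / h_n) \
              / ((g(a + h_n) - g(a)) / h_n)              # right-hand side of (♠)
        print(n, lhs, rhs, target)        # lhs and rhs agree, and both -> target = 1

Here each $ y_{n} $ happens to have a unique preimage; the possible non-monotonicity of $ g $ is what forces the proof, in general, to pick a preimage closest to $ 0 $.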

2

One family of functions that works is $f(x)=|x|$ and $g(x)$ any strictly monotonic, surjective, differentiable function with $g(0)=g'(0)=0$.

Suppose $g$ is strictly increasing. Then $g(x)<0$ for all $x<0$ and $g(x)>0$ for all $x>0$. Thus, $f(g(x))=-g(x)$ for $x<0$ and $f(g(x))=g(x)$ for $x>0$, so the right- and left-hand difference quotients of $f\circ g$ at $0$ equal $\frac{g(h)-g(0)}{h}$ and $-\frac{g(h)-g(0)}{h}$ respectively, and both tend to $0$ because $g'(0)=0$; hence $f\circ g$ is differentiable at $0$ with derivative $0$. Of course, $f\circ g$ is differentiable everywhere else, since $f$ is differentiable everywhere except at $0$ and $g$ is differentiable everywhere. The case when $g$ is strictly decreasing is analogous.
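For a concrete member of this family (my own choice, just to make the example explicit), take $g(x)=x^{3}$, which is strictly increasing, surjective, and differentiable with $g(0)=g'(0)=0$. Then $$ (f \circ g)(x) = \left| x^{3} \right| = |x|^{3}, \qquad (f \circ g)'(0) = \lim_{h \to 0} \frac{|h|^{3} - 0}{h} = \lim_{h \to 0} h\,|h| = 0, $$ and for $x \neq 0$ we have $(f \circ g)'(x) = 3x|x|$, so $f \circ g$ is differentiable everywhere even though $f(x)=|x|$ fails to be differentiable at $0 = g(0)$.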

Showhat