
Would you please help me solve Problem 55 in Chapter 6 of Real Analysis, fifth edition, by Royden and Fitzpatrick, which I am self-studying? Chapter 6 deals with Lebesgue integration.

55. For $f$ and $g$ as in the statement of Theorem 17, is it possible to establish the change of variables formula (30) by directly verifying that $$\frac d{dx}\biggl[\int_c^{g(x)} f\,dm -\int_a^x [f\circ g]\cdot g'\,dm\biggr] = 0\text{ for almost all }x\in (a,b)?$$

Theorem 17 (Change of Variables) Let the function $f:[c,d]\to\mathbf R$ be integrable. If the function $g:[a,b]\to\mathbf R$ is increasing and absolutely continuous, $c = g(a)$ and $d = g(b)$, then $$\int_c^d f\,dm =\int_a^b [f\circ g]\cdot g'\,dm.\qquad (30)$$
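
(As a sanity check of (30), here is a small numerical sketch I wrote, assuming numpy is available; it uses a bounded $f$ and an increasing, absolutely continuous $g$ that is constant on the middle third of $[0,1]$. It is of course no substitute for a proof.)

```python
import numpy as np

# f: a bounded integrable function on [c, d]
f = lambda y: y**2

# g: increasing and absolutely continuous on [0, 1], constant on [1/3, 2/3]
def g(x):
    return np.where(x <= 1/3, x, np.where(x <= 2/3, 1/3, x - 1/3))

def g_prime(x):
    # g' exists a.e.: it is 1 off the flat piece and 0 on it
    return np.where((x > 1/3) & (x < 2/3), 0.0, 1.0)

a, b = 0.0, 1.0
c, d = float(g(a)), float(g(b))     # c = 0, d = 2/3

# midpoint Riemann sums as stand-ins for the Lebesgue integrals
n = 200_000
y = (np.arange(n) + 0.5) * (d - c) / n + c
x = (np.arange(n) + 0.5) * (b - a) / n + a

lhs = np.sum(f(y)) * (d - c) / n                   # int_c^d f dm
rhs = np.sum(f(g(x)) * g_prime(x)) * (b - a) / n   # int_a^b (f o g) g' dm

print(lhs, rhs)   # both approximately 8/81 = 0.0987...
```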

Without loss of generality, we can assume that $f$ is non-negative. Then Problem 55 is the same as the fourth-edition problem "Justifying the change of variables formula $\int_{g(a)}^{g(b)} f(y)dy = \int_a^b f(g(x))g'(x)dx$ for Lebesgue Integration", except that $g$ is merely increasing rather than strictly increasing.

For the fourth-edition problem, the answer is yes, as the answer at the linked page shows.

My intuition, however, tells me that the answer to the fifth-edition problem (Problem 55) is no. If so, here is a possible way to begin an answer:

Define function $h:[a,b]\to\textbf R$ by $$h(x) =\int_c^{g(x)} f\,dm -\int_a^x [f\circ g]\cdot g'\,dm.$$ The claim is that if $h'(x) = 0$ for almost all $x$ in $(a,b)$, then $h(b) = 0$. But for the Cantor-Lebesgue function $\varphi:[0,1]\to [0,1]$, for example, $\varphi'(x) = 0$ for almost all $x$ in $(0,1)$, but $\varphi(1) = 1\ne 0$.
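
(For concreteness, here is a small Python sketch of the Cantor-Lebesgue function, based on its usual ternary construction; it illustrates that $\varphi(1) = 1$ while $\varphi$ is locally constant off the Cantor set. It is only an illustration.)

```python
def cantor(x, depth=50):
    """Approximate the Cantor-Lebesgue function on [0, 1] via the ternary construction."""
    y, scale = 0.0, 0.5
    for _ in range(depth):
        if x < 1/3:          # left third: zoom in, no contribution yet
            x = 3 * x
        elif x > 2/3:        # right third: add the current dyadic weight
            y += scale
            x = 3 * x - 2
        else:                # middle third: phi is constant here
            return y + scale
        scale /= 2
    return y

print(cantor(1.0))                          # ~ 1.0, i.e. phi(1) = 1
# phi is constant near any point outside the Cantor set,
# so the difference quotient there is exactly 0:
print(cantor(0.5 + 1e-6) - cantor(0.5))     # 0.0
```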

But what can I choose for $f$ and $g$ so that $h$ behaves like the Cantor-Lebesgue function?

user0

1 Answer


My answer to the question in Problem 55 is "well, sort of": yes, provided one can directly verify that the derivative vanishes a.e.

Let's have a look at what we need in order to do this, and what obstacles may have to be overcome.

  1. We need to show that $h$ is almost everywhere differentiable, and $h'(x) = 0$ almost everywhere.
  2. To infer $h(b) = 0$, or $h(x) \equiv 0$, from that, we need to show that $h$ is absolutely continuous.
  3. To show that $h$ is absolutely continuous, we first have to show that $h$ is well-defined, and finite everywhere.

Clearly, without having done 3, it's pointless to try and prove 1 and 2. So let's consider 3.

First, since $f$ is integrable, the function \begin{equation} F \colon y \mapsto \int_c^y f\,dm \end{equation} is well-defined and absolutely continuous on $[c,d]$. Then of course $F\circ g$ is well-defined and continuous. The part \begin{equation} \int_a^x [f\circ g]\cdot g'\,dm \end{equation} remains to be investigated. A necessary condition for it to be well-defined is that $t \mapsto f(g(t))g'(t)$ be Lebesgue-measurable. I refer to PhoemueX's answer here for that. One point we shall need later is that for every null set $N \subset [c,d]$ the function $t \mapsto \chi_N(g(t))\cdot g'(t)$ vanishes almost everywhere in $[a,b]$. I shelve the question of the integrability of $[f\circ g]\cdot g'$ for now, and shall first assume that $0 \leqslant f(u) \leqslant K$ for all $u \in [c,d]$ with some $K > 0$. Then the integrability is clear because $0 \leqslant [f\circ g]\cdot g' \leqslant K\cdot g'$.
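
To spell the bound out: since $g$ is increasing and absolutely continuous, $\int_a^b g'\,dm = g(b) - g(a)$, so \begin{equation} \int_a^b [f\circ g]\cdot g'\,dm \leqslant K \int_a^b g'\,dm = K\bigl(g(b) - g(a)\bigr) < \infty\,. \end{equation}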

Coming to 2: since the second member of $h$ is an indefinite integral of an integrable function, it is of course absolutely continuous on $[a,b]$. The first member, $F\circ g$, is absolutely continuous too. Let $\varepsilon > 0$. Since $F$ is absolutely continuous (by the current assumptions it is even Lipschitz, but that doesn't play a role here), there is an $\eta > 0$ such that \begin{equation} \sum_{i = 1}^n \lvert F(v_i) - F(u_i)\rvert \leqslant \varepsilon \end{equation} for all finite collections of disjoint open intervals $(u_i,v_i)$ with total length not exceeding $\eta$. Since $g$ is absolutely continuous, there is a $\delta > 0$ such that \begin{equation} \sum_{i = 1}^n \bigl(g(t_i) - g(s_i)\bigr) \leqslant \eta \end{equation} for all finite collections of disjoint open intervals $(s_i,t_i)$ with total length not exceeding $\delta$. For such a collection, since $g$ is monotonic, the $(g(s_i),g(t_i))$ form a collection of disjoint open intervals (some possibly empty) with total length $\leqslant \eta$, and therefore \begin{equation} \sum_{i = 1}^n \lvert F(g(t_i)) - F(g(s_i))\rvert \leqslant \varepsilon\,. \end{equation} Thus $F \circ g$ is absolutely continuous. (Here $F$ is nondecreasing, so the modulus is not necessary, but the conclusion that $F\circ g$ is AC doesn't need the monotonicity of $F$.) Linear combinations of absolutely continuous functions are absolutely continuous, hence $h$ is AC.

Thus, coming to 1, we know already that $h$ is differentiable almost everywhere, and we only need to show $h'(x) = 0$ a.e. Now \begin{equation} \frac{d}{dx} \int_a^x [f\circ g]\cdot g'\,dm = f(g(x))g'(x) \end{equation} for almost all $x \in [a,b]$ is standard, and it remains to see that $(F\circ g)'(x) = f(g(x))g'(x)$ for almost all $x \in [a,b]$. Again by one direction of the FTC (Lebesgue version) there is a null set $N \subset [c,d]$ with $F'(u) = f(u)$ for all $u \in [c,d]\setminus N$. Thus we have $(F\circ g)'(x) = f(g(x))g'(x)$ for all $x \in [a,b]\setminus g^{-1}(N)$ at which $g$ is differentiable. However, $g^{-1}(N)$ need not be a null set, so we are not finished yet. But by the remark above, $g'(x) = 0$ for almost all $x \in g^{-1}(N)$. And since $F$ is Lipschitz-continuous, $F\circ g$ is differentiable with $(F\circ g)'(x) = 0$ for all $x$ with $g'(x) = 0$. At such points, clearly $(F\circ g)'(x) = f(g(x))g'(x)$. Hence indeed $(F\circ g)'(x) = f(g(x))g'(x)$ except on a null set where $g$ is not differentiable, and a null set contained in $g^{-1}(N)$.
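
In case it helps to see the Lipschitz step written out: if $\lvert F(v) - F(u)\rvert \leqslant K\lvert v - u\rvert$ on $[c,d]$, then at any $x$ with $g'(x) = 0$ \begin{equation} \biggl\lvert \frac{F(g(x+h)) - F(g(x))}{h} \biggr\rvert \leqslant K\,\biggl\lvert \frac{g(x+h) - g(x)}{h} \biggr\rvert \to K\cdot g'(x) = 0 \quad \text{as } h \to 0\,, \end{equation} so $(F\circ g)'(x)$ exists and equals $0$.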

So we have proved everything we needed for non-negative bounded $f$. To obtain the unbounded case from this, consider $f_n(u) = \min \{n, f(u)\}$ and the corresponding $F_n$. By the monotone convergence theorem, $F_n$ converges pointwise (and monotonically) to $F$, hence by Dini's theorem uniformly. Since $\int_a^b [f_n\circ g]\cdot g'\,dm = F_n(g(b)) \leqslant F(d)$ for every $n$, the monotone convergence theorem also resolves the shelved point: $[f\circ g]\cdot g'$ is indeed integrable. It then follows that $h$ is absolutely continuous, for the proof of the absolute continuity of $F\circ g$ did not use the temporarily assumed boundedness. And the $h_n$, each of which is identically $0$ by the bounded case, converge uniformly to $h$, whence $h \equiv 0$. The general case then follows from the decomposition $f = f^+ - f^-$ and the result for non-negative $f$. So we cannot construct an example where $h'(x) = 0$ a.e. and $h \not\equiv 0$.
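
For a concrete unbounded instance (my own illustration): with $f(u) = u^{-1/2}$ on $(0,1]$, which is integrable but unbounded, one gets, since $f - f_n \geqslant 0$, \begin{equation} \sup_{y \in [0,1]} \lvert F(y) - F_n(y)\rvert \leqslant \int_0^1 (f - f_n)\,dm = \int_0^{1/n^2} \bigl(u^{-1/2} - n\bigr)\,du = \frac{2}{n} - \frac{1}{n} = \frac{1}{n}\,, \end{equation} which makes the uniform convergence of the $F_n$ quite tangible.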

But here we have not directly verified that $h'(x) = 0$ almost everywhere. That is clear at the $x \in [a,b] \setminus g^{-1}(N)$ where $g$ is differentiable. The problem is showing $h'(x) = 0$ a.e. on $g^{-1}(N)$. In the above, we used the assumption that $f$ be bounded to deduce $(F\circ g)'(x) = 0$ almost everywhere on $g^{-1}(N)$.

I don't see a way to prove that directly, using only that $F$ is AC (and monotonic) and $g$ is AC and monotonic (even if $g$ is strictly monotonic). After all, $F(u) = u^{\alpha}$ is AC (and monotonic) on $[0,1]$ for $\alpha > 0$, $g(t) = t^{\beta}$ is AC and strictly monotonic with $g'(0) = 0$ for $\beta > 1$, but $F\circ g \colon t \mapsto t^{\alpha\beta}$ only has $(F\circ g)'(0) = 0$ for $\alpha\beta > 1$. Of course that is a single point, but a priori it is not clear that such things can't happen on a set of positive measure.
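
(A quick numerical look at the difference quotient at $0$, just to make the dichotomy visible; plain Python, nothing more than arithmetic.)

```python
# difference quotient of (F o g)(t) = t**(alpha*beta) at t = 0
def dq(alpha, beta, h):
    return h ** (alpha * beta) / h      # equals h**(alpha*beta - 1)

for h in (1e-2, 1e-4, 1e-6):
    # alpha*beta = 0.75 < 1: the quotient blows up, so (F o g)'(0) is not 0
    # alpha*beta = 1.20 > 1: the quotient tends to 0, so (F o g)'(0) = 0
    print(dq(0.5, 1.5, h), dq(0.8, 1.5, h))
```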

Dermot Craddock
  • I interpret Problem 55 as "If $h' = 0$ almost everywhere, then is $h(b) = 0$?", i.e., we do not have to prove your item 1. (Although for the fourth-edition problem, it is not too difficult to prove that $h' = 0$ almost everywhere.) – user0 Jun 30 '25 at 17:41
  • The answer to that is "Yes". That follows as soon as $h$ is recognised as absolutely continuous. – Dermot Craddock Jun 30 '25 at 17:43