Reading through Rudin's *Real and Complex Analysis*, I came across the following exercise:
Suppose $(f_n: X \to [0,\infty])$ is a monotone decreasing sequence of measurable functions such that $\lim\limits_{n \to \infty} f_n(x) = f(x)$ for all $x \in X$. Prove that if $f_1 \in L^1(\mu)$, then
$$\lim\limits_{n \to \infty}\int\limits_{X} f_n \, \mathrm{d}\mu = \int\limits_X f \, \mathrm{d}\mu.$$
This looks like a straightforward application of the Dominated Convergence Theorem, taking $f_1$ to be the dominating function. But an exercise would presumably not be so trivial as to merit essentially a one-line proof. Is there a reason the DCT fails to apply here?
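To make the question concrete, here is the one-line argument I have in mind (assuming I am not overlooking a measurability or integrability subtlety):

```latex
Since $(f_n)$ is monotone decreasing, $0 \le f_n \le f_1$ pointwise on $X$
for every $n$, and $f_1 \in L^1(\mu)$, so $f_1$ dominates the sequence.
The Dominated Convergence Theorem then gives
\[
  \lim_{n \to \infty} \int_X f_n \, \mathrm{d}\mu
  = \int_X \lim_{n \to \infty} f_n \, \mathrm{d}\mu
  = \int_X f \, \mathrm{d}\mu.
\]
```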