6

Problem:

Find a non-negative function $f$ on $[0,1]$ such that $$\lim_{t\to\infty} t\cdot m(\{x : f(x) \geq t\}) = 0,$$ but $f$ is not integrable, where $m$ is Lebesgue measure.

My Attempt:

Let $f(x) = \frac{\chi_{(0,1]}(x)}{\sqrt{x}}$. Then, for $t \geq 1$, \begin{align*} \lim_{t\to\infty} t\cdot m(\{x \in [0,1]: f(x) \geq t\}) &= \lim_{t\to\infty} t \cdot m(\{x\in (0,1]: 1/\sqrt{x} \geq t\})\\ &=\lim_{t\to\infty} t \cdot m(\{x\in (0,1]: x \leq 1/t^2\})\\ &= \lim_{t\to\infty} t \cdot m\big((0, 1/t^2]\big)\\ &= \lim_{t\to\infty} \frac{t}{t^2} = 0. \end{align*}

However, $f(x)$ is integrable over this interval. I have tried functions looking like $f(x) = 1/x^p$ but I cannot find any that will work here. There is a hint that says there is a monotonic function that fits this description.
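For instance, here is a quick sketch of why no pure power can work (taking $p>0$ and $t\geq 1$): for $f(x)=x^{-p}$ on $(0,1]$, $$t\cdot m(\{x\in(0,1]: x^{-p}\geq t\})=t\cdot m\big((0,t^{-1/p}]\big)=t^{1-1/p},$$ which tends to $0$ as $t\to\infty$ exactly when $p<1$, and those are precisely the powers for which $x^{-p}$ is integrable on $(0,1]$; for $p=1$ the limit is $1$ and for $p>1$ it is $\infty$.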

Also, does anyone know of a list of non-Lebesgue integrable functions on $[0,1]$? I feel as though I could use this for many counterexamples if one were to exist. Thanks!

dannum
  • 2,589
  • 2
    $-1$ is the largest $p$ such that your limit does not hold with $x^p$ (since the resulting limit is $1$), and it isn't integrable. So $x^{-1+\varepsilon}$ diverges too slowly for every $\varepsilon > 0$, but $x^{-1}$ diverges too fast. So can you make a function in between? – Ian Jul 19 '14 at 22:01
  • I'm sure that there is one, but I cannot think of it. I've spent some time looking at functions like $f(x) = \frac{x}{1-x^2}$, but I believe the limit will go to $\infty$ rather than $0$. – dannum Jul 20 '14 at 00:15
  • 2
    Add $1/(x^p \log^q x)$ to your arsenal of examples. The logarithmic term is relatively minor, but it can sway integrability one way or the other when the function is right at the edge of being integrable (see the comment by Ian). –  Jul 20 '14 at 06:24

2 Answers

5

The example given in a previous version of this answer was not quite accurate and in any case overcomplicated. Let $I_n$ denote the interval $(\delta_{n+1},\delta_n)$, where $\delta_n=2^{-n}/n$, and let $$ f(x)=\sum_{n\geqslant 1}2^n\mathbf{1}_{I_n}(x). $$ Then $f$ is non-negative, $2^k\,m(f\geqslant 2^k)=2^k\sum_{n\geqslant k}m(I_n)=\frac 1k$, and $$ \int f\,dm=\sum_{n\geqslant 1 }2^n m(I_n), $$ where $2^n m(I_n)$ is of order $1/n$, so the integral is infinite.
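In case it helps, here is one way to spell out these claims (a sketch using only the definitions above). Since $\delta_n\to 0$, the tail sum telescopes: $$\sum_{n\geqslant k}m(I_n)=\sum_{n\geqslant k}(\delta_n-\delta_{n+1})=\delta_k=\frac{2^{-k}}{k},\qquad\text{hence}\qquad 2^k\,m(f\geqslant 2^k)=2^k\delta_k=\frac1k .$$ For a general level $2^k<t\leqslant 2^{k+1}$ one has $\{f\geqslant t\}=\{f\geqslant 2^{k+1}\}$, so $t\,m(f\geqslant t)\leqslant 2^{k+1}\,m(f\geqslant 2^{k+1})=\frac{1}{k+1}\to 0$ as well. Finally, $$2^n m(I_n)=2^n(\delta_n-\delta_{n+1})=\frac1n-\frac{1}{2(n+1)}\sim\frac{1}{2n},$$ so $\int f\,dm=\sum_{n\geqslant 1}2^n m(I_n)=\infty$ by comparison with the harmonic series.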

Davide Giraudo
  • 181,608
  • 2
    After looking at the graph of this function, it looks like you should find two points, $x_t$, for large enough $t$ and not just one. Is there a way to still justify that the measure of this set is actually $x_t$? Solving this equation for $x$ is not exactly an easy thing to do. – dannum Jul 20 '14 at 16:08
  • We have to compute the measure of the set $\{x : -x\log x\leqslant 1/t\}$, and we use a monotonicity property of $t\mapsto -t\log t$. – Davide Giraudo Jul 20 '14 at 17:27
  • 2
    Yes I see that as $t$ increases $-t\log(t) \to -\infty$, but what I don't understand is how $m(\{x:f(x) \geq t\}) = x_t$ since there is more than one point in $(0,1)$ such that $-x\log(x) = 1/t$. Sorry if I am oblivious to some obvious fact. – dannum Jul 20 '14 at 18:32
  • @danielson you might be right. For large $t$, there are two roots, $x_t$ being the one close to $0$ (equal to $x_t$ as above), $y_t$ being the other one close to $1$; then $m(f\geq t) = x_t + 1 - y_t$. By the same argument, we have $$ t\cdot x_t \longrightarrow 0 $$ while $$ t\cdot [ 1 - y_t ]\longrightarrow 1 , $$ so $$ t\cdot m(f\geq t) \longrightarrow 1 \neq 0, $$ which is not what we wanted. – Chival Mar 08 '15 at 21:49
  • 2
    @DavideGiraudo A possible modification could be given as follows: define $g(x) = - x\log x$ for $x\in[0, e^{-1}]$ and $g(x) = e^{-1}$ for $x\in[e^{-1}, 1]$. We set $f(x) = g(x)^{-1} \cdot 1_{(0 < x < 1)}$. Then $f$ is not integrable while for large $t$, there is only one root to the equation $g(x) = 1/t$. Hence your argument could succeed. – Chival Mar 08 '15 at 21:58
  • (+1) @DavideGiraudo: Hi Davide, this is a very old question and I do like your counterexample. However, it does not quite work, as the singularity at $1$ grows a little slower and accumulates mass at a rate $1$. Any simple truncation (or elimination) of your example will work, for example $g(x)=\frac{1}{x|\log x|}\mathbb{1}_{(0,e^{-1}]}(x)$. – Mittens Feb 08 '22 at 22:41
  • 1
    I actually modified the function, as I found it too complicated. Good that you could modify it a bit. – Davide Giraudo Feb 09 '22 at 10:31
2

This is just to show that the function $f(x)=\frac{1}{x|\log x|}\mathbb{1}_{(0,1)}(x)$ proposed by Davide Giraudo (in an earlier version of his answer) does not quite do the trick, although simple modifications of it (suggested by @Chival) will.

The issue is the singularity at $1$.

Notice that the function $f(x)=\frac{1}{x|\log x|}$ is convex on $(0,1)$, attains its minimum value ($e$) at $x=e^{-1}$, and $$\lim_{x\rightarrow0+}f(x)=\lim_{x\rightarrow1-}f(x)=\infty.$$ For any $t>e$, there are exactly two points $0<a_t<e^{-1}<b_t<1$ such that $f(a_t)=f(b_t)=t$. In fact, $$a_t\xrightarrow{t\rightarrow\infty}0,\qquad b_t\xrightarrow{t\rightarrow\infty}1$$ by the convexity and monotonicity of $f$ on $(0,e^{-1})$ and $(e^{-1},1)$. Then $$t\,m(f>t)=t\,m\big((0,a_t)\cup(b_t,1)\big)=ta_t+t(1-b_t)\xrightarrow{t\rightarrow\infty}1,$$ since $$ta_t=\frac{1}{|\log a_t|}\xrightarrow{t\rightarrow\infty}0$$ and $$t(1-b_t)=f(b_t)(1-b_t)=-\frac{1-b_t}{b_t\log b_t}\xrightarrow{t\rightarrow\infty}1.$$ The last bit follows by a simple application of L'Hôpital's rule: $$\frac{b-1}{b\log b}\sim \frac{1}{1+\log b}\xrightarrow{b\rightarrow1-}1.$$


Notice that any truncation of $f$ that removes the singularity at $1$, say $g(x)=\frac{1}{x|\log x|}\mathbb{1}_{(0,e^{-1})}(x)$, will do the job.
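For the record, here is a quick sketch of the verification for this $g$ (with $a_t$ as above): for $t>e$ one has $\{g>t\}=(0,a_t)$, so $$t\,m(g>t)=t\,a_t=\frac{1}{|\log a_t|}\xrightarrow{t\rightarrow\infty}0,$$ while the substitution $u=-\log x$ gives $$\int_0^{e^{-1}}\frac{dx}{x|\log x|}=\int_1^{\infty}\frac{du}{u}=\infty,$$ so $g$ satisfies the limit condition but is not integrable.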

Mittens
  • 46,352