
Can you help me solve the following problem?

From the Lebesgue differentiation theorem we can conclude that if $f\in L_{loc}^{1}(\Omega)$ then, $$f(x)=\lim_{r\to 0}\frac{1}{|B(x, r)|}\int_{B(x, r)}f(y)\ dy,$$ for almost every point $x\in \Omega$, where $\Omega\subseteq \mathbb R^n$ is an open set.
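As a quick numerical illustration of the theorem (not part of the proof), here is a minimal Python sketch with the test function $f(x)=x^2$, chosen by me for illustration: the average of $f$ over $(x-r, x+r)$ is exactly $x^2 + r^2/3$, which tends to $f(x)$ as $r\to 0$.

```python
def ball_average(f, x, r, n=10_000):
    """Approximate (1/2r) * integral of f over (x - r, x + r) by the midpoint rule."""
    h = 2 * r / n
    total = sum(f(x - r + (i + 0.5) * h) for i in range(n))
    return total * h / (2 * r)

f = lambda t: t * t  # illustrative choice; for this f the exact average is x^2 + r^2/3
x = 1.5
for r in (1.0, 0.1, 0.01):
    print(f"r = {r:5}: average = {ball_average(f, x, r):.6f}, f(x) = {f(x):.6f}")
```

As $r$ shrinks, the printed averages approach $f(1.5) = 2.25$, matching the limit in the theorem.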

Using this how can I show the following:

Let $f\in L^1(\mathbb R)$ such that:

(a) $f(x+y)=f(x)f(y)$.

(b) $f(x+k)=f(x)$ for all $k\in\mathbb Z$.

(c) $\int_{0}^{\lambda}f(\tau)\ d\tau=0$ for all $\lambda>0$.

Then $f=0$ almost everywhere in $\mathbb R$. Maybe not all of these hypotheses will be necessary. Thanks!

PtF

2 Answers


For a.e. $x\in\mathbb{R}$ with $x > 0$, $$ f(x)=\lim_{\lambda\to 0^+}\frac{1}{2\lambda}\int_{(x-\lambda, x+\lambda)}f(\tau)\ d\tau $$ by the theorem you cite. But with (c), $$ \int_{(x-\lambda, x+\lambda)}f(\tau)\ d\tau = \int_{(0, x+\lambda)}f(\tau)\ d\tau - \int_{(0, x-\lambda)}f(\tau)\ d\tau = 0 - 0 = 0 $$ as long as $\lambda$ is small enough (so that $x > \lambda$). You get that $f(x)=0$ for a.e. $x > 0$.

Then, with (b), writing $x < 0$ as $x=k+y$ with $k\in\mathbb{Z}$ and $y>0$, you can conclude $f(x)=0$ for a.e. $x\in\mathbb{R}$.
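Spelling out that last step in my own notation (taking $k$ a negative integer so that the translate lands in $(0,\infty)$):

```latex
% Fix x < 0 and any integer k < x, and set y = x - k > 0.
% Applying (b) at the point y with this same integer k gives
\[
  f(x) \;=\; f(y + k) \;=\; f(y).
\]
% Since f = 0 a.e. on (0, \infty) and translation by the integer k
% preserves Lebesgue-null sets, f(x) = 0 for a.e. x < 0 as well.
```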

Clement C.

You can also do this without the Lebesgue differentiation theorem. It follows from assumption (c) that $f(x) = 0$ for almost every $x > 0$; see this question. Combining this with (b) shows the same holds for $x \le 0$. Assumption (a) is not needed.

Nate Eldredge
  • I think it's possible that any two of the three conditions are enough, though. For this combination it is the most obvious, of course. – tomasz Jul 24 '13 at 18:02