
Let ${0 < p_0 < p_1 <\infty}$ and ${f \in L^{p_0}(X) \cap L^{p_1}(X)}$. Then ${f \in L^p(X)}$ for all ${p_0 \leq p \leq p_1}$, and furthermore we have $$ \displaystyle \| f\|_{L^{p_\theta}(X)} \leq \|f\|_{L^{p_0}(X)}^{1-\theta} \|f\|_{L^{p_1}(X)}^{\theta} $$ for all ${0 \leq \theta \leq 1}$, where the exponent ${p_\theta}$ is defined by ${1/p_\theta := (1-\theta)/p_0 + \theta/p_1}$.
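(As a quick sanity check, not part of the statement above, here is a small numerical experiment on a finite measure space; the arrays `f` and `mu` and the exponents are chosen purely for illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)
f  = rng.random(50) * 5           # sample values of |f| at 50 points
mu = rng.random(50)               # positive weights playing the role of d(mu)

def lp_norm(f, mu, p):
    """Discrete analogue of ||f||_{L^p}: (sum |f|^p * mu)^(1/p)."""
    return np.sum(np.abs(f) ** p * mu) ** (1.0 / p)

p0, p1 = 1.5, 4.0
for theta in np.linspace(0.05, 0.95, 10):
    p_theta = 1.0 / ((1 - theta) / p0 + theta / p1)
    lhs = lp_norm(f, mu, p_theta)
    rhs = lp_norm(f, mu, p0) ** (1 - theta) * lp_norm(f, mu, p1) ** theta
    assert lhs <= rhs * (1 + 1e-12), (theta, lhs, rhs)
print("interpolation inequality holds on this sample")
```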

One can of course prove this statement by Hölder's inequality. Here is another approach suggested in one of Terry Tao's real analysis notes. Consider the log-convexity inequality:

$$ \displaystyle |f(x)|^{p_\theta} \leq (1-\alpha) |f(x)|^{p_0} + \alpha |f(x)|^{p_1} $$ for all ${x}$, where ${0 < \alpha < 1}$ is the quantity such that ${p_\theta = (1-\alpha) p_0 + \alpha p_1}$. By integrating this inequality in ${x}$, one already obtains the claim in the normalised case $$ \|f\|_{L^{p_0}(X)} = \|f\|_{L^{p_1}(X)} = 1, $$ since then the right-hand side integrates to ${(1-\alpha) + \alpha = 1}$, so ${\|f\|_{L^{p_\theta}(X)}^{p_\theta} \leq 1}$.
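(For a concrete illustration of this log-convexity step, here is a small numerical check; the sample values and exponents are hypothetical and chosen only for this sketch. It verifies that the weight ${\alpha}$ lies in ${(0,1)}$ and that the pointwise inequality holds at every sample point.)

```python
import numpy as np

rng = np.random.default_rng(1)
f = rng.random(1000) * 10          # sample values of |f(x)|
p0, p1, theta = 1.5, 4.0, 0.3
p_theta = 1.0 / ((1 - theta) / p0 + theta / p1)

# alpha is the weight for which p_theta = (1 - alpha)*p0 + alpha*p1;
# since p0 < p_theta < p1, it lies strictly between 0 and 1.
alpha = (p_theta - p0) / (p1 - p0)
assert 0 < alpha < 1

# pointwise log-convexity: |f(x)|^{p_theta} <= (1-alpha)|f(x)|^{p0} + alpha|f(x)|^{p1}
lhs = f ** p_theta
rhs = (1 - alpha) * f ** p0 + alpha * f ** p1
assert np.all(lhs <= rhs + 1e-9)
print("pointwise inequality holds at every sample point")
```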

Here is my question:

It is said that to obtain the general case, one can multiply the function ${f}$ and the measure ${\mu}$ by appropriately chosen constants to obtain the above normalisation. Could somebody elaborate on how such a normalisation can be done?

1 Answer


Assume that $\|f\|_{L^{p_0}}=A_0$ and $\|f\|_{L^{p_1}}=A_1$, i.e. $$ \int \left|f(x)\right|^{p_0}\,d\mu = A_0^{p_0},\qquad \int \left|f(x)\right|^{p_1}\,d\mu = A_1^{p_1}. $$ We want to find two constants $C,D>0$ such that, setting $\hat{f}=Cf$ and $\hat{\mu}=D\mu$, we have $$ \int \left|\hat{f}(x)\right|^{p_0}\,d\hat{\mu}=\int \left|\hat{f}(x)\right|^{p_1}\,d\hat{\mu}=1, $$ that is, $$ C^{p_0} D A_0^{p_0} = C^{p_1} D A_1^{p_1} = 1. $$ The equality between the two left-hand sides does not involve $D$ and is equivalent to $C^{p_1-p_0}=\frac{A_0^{p_0}}{A_1^{p_1}}$, i.e. $C=\left(\frac{A_0^{p_0}}{A_1^{p_1}}\right)^{1/(p_1-p_0)}$. Once $C$ is chosen in this way, taking $D=C^{-p_0}A_0^{-p_0}$ makes the common value equal to $1$ as well.
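(Here is a short numerical illustration of this normalisation in a discrete setting; the arrays `f` and `mu` and the exponents are placeholders for this sketch. Computing $C$ and $D$ from the formulas above and rescaling indeed makes both integrals equal to $1$.)

```python
import numpy as np

rng = np.random.default_rng(2)
f  = rng.random(50) * 5            # sample values of |f|
mu = rng.random(50)                # weights standing in for the measure mu
p0, p1 = 1.5, 4.0

A0 = np.sum(f ** p0 * mu) ** (1 / p0)   # A0 = ||f||_{L^{p0}}
A1 = np.sum(f ** p1 * mu) ** (1 / p1)   # A1 = ||f||_{L^{p1}}

C = (A0 ** p0 / A1 ** p1) ** (1.0 / (p1 - p0))   # equalises the two integrals
D = 1.0 / (C ** p0 * A0 ** p0)                   # rescales their common value to 1

f_hat, mu_hat = C * f, D * mu
print(np.sum(f_hat ** p0 * mu_hat))   # ~1.0
print(np.sum(f_hat ** p1 * mu_hat))   # ~1.0
```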

Jack D'Aurizio