17

Let $X, Y$ be two identically distributed (i.d.) positive random variables. If they are furthermore independent, then from the Cauchy–Schwarz inequality (applied in the last step), one has

$$ E[X/Y] = E[X]\cdot E[1/Y] = E[X]\cdot E[1/X] \geq E[\sqrt{X}/\sqrt{X}]^2 = (E[\sqrt{X}/\sqrt{X}])^2 = 1 $$

If one no longer assumes independence of $X$ and $Y$ (but still assumes $X\sim Y$), Jensen's inequality gives $$ \log E[X/Y] \geq E[\log X - \log Y], $$ and the right-hand side is $0$ if $\log X$ is assumed integrable; this again gives $(*)\; E[X/Y] \geq 1$.

I am pretty sure (still in the case where $X$ and $Y$ are identically distributed and possibly dependent) that $(*)$ holds true without the hypothesis $\log X \in L^1$, but my attempts don't work (I tried to replace $X$ and $Y$ by truncated versions, but the dominated convergence theorem doesn't apply). Maybe there is another way to do it?
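As a quick sanity check (my addition, not a proof), here is a Monte Carlo experiment for a dependent case; the coupling $X = e^Z$, $Y = e^{-Z}$ with $Z$ standard normal is an illustrative choice under which $X \sim Y$ but $X$ and $Y$ are strongly dependent:

```python
import numpy as np

# With Z ~ N(0,1), X = e^Z and Y = e^{-Z} have the same (lognormal) law,
# since Z and -Z are identically distributed, yet X and Y are dependent.
rng = np.random.default_rng(0)
z = rng.standard_normal(10**6)
x, y = np.exp(z), np.exp(-z)
print(np.mean(x / y))  # X/Y = e^{2Z}, so E[X/Y] = e^2 ≈ 7.389 >= 1
```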

Amir
  • 11,124
dionysos
  • 695
  • LaTeX and MathJax usage note: $E[X]\cdot E[1/Y]$ differs typographically from $E[X].E[1/Y].$ The former uses \cdot. – Michael Hardy Mar 09 '25 at 16:59
  • Every time I see something like $\operatorname E[UV]^2$ I hesitate over whether it means $\big( \operatorname E[UV] \big)^2$ or $\operatorname E[(UV)^2],$ but as we see here, unambiguous notations exist. – Michael Hardy Mar 09 '25 at 17:15

4 Answers

12

Let $X$ and $Y$ be positive, identically distributed RVs. WLOG we may assume $\mathbb{E}[X/Y] < \infty$, for otherwise the desired inequality holds trivially. Then:

Lemma 1. $\mathbb{E}[\log (X/Y)]$ exists in $[-\infty, \infty)$. Moreover, we have $$ \mathbb{E}[X/Y] \geq 1 + \mathbb{E}[\log (X/Y)]. $$

Proof. This is an immediate consequence of the inequality $1 + x \leq e^x$ applied with $x = \log(X/Y)$: taking expectations in $1 + \log(X/Y) \leq X/Y$ gives the inequality, and $\mathbb{E}[\log(X/Y)]$ exists in $[-\infty, \infty)$ because $\log(X/Y)$ is dominated from above by the integrable random variable $X/Y - 1$.

Now the key observation is as follows:

Lemma 2. Let $U$ and $V$ be arbitrary positive RVs such that $\mathbb{E}[\log(U/V)]$ exists in $[-\infty, +\infty]$. Then

$$ \mathbb{E}[\log(U/V)] = \int_{0}^{\infty} \frac{\mathbb{E}[e^{-Vs}] - \mathbb{E}[e^{-Us}]}{s} \, \mathrm{d}s. $$

Using Lemmas 1 and 2 and noting that $\mathbb{E}[e^{-Ys}] = \mathbb{E}[e^{-Xs}]$ (since $X$ and $Y$ are identically distributed), the desired inequality is easily obtained:

$$\begin{align*} \mathbb{E}[X/Y] &\geq 1 + \mathbb{E}[\log(X/Y)] \\ &= 1 + \int_{0}^{\infty} \frac{\mathbb{E}[e^{-Ys}] - \mathbb{E}[e^{-Xs}]}{s} \, \mathrm{d}s \\ &= 1 \end{align*}$$
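(The following numerical check of Lemma 2 is an illustration of mine, not part of the answer. It takes independent $U \sim \operatorname{Exp}(1)$ and $V \sim \operatorname{Exp}(2)$, whose Laplace transforms are $1/(1+s)$ and $2/(2+s)$; for this pair $\mathbb{E}[\log(U/V)] = \log 2$.)

```python
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(0)
u = rng.exponential(scale=1.0, size=10**6)  # U ~ Exp(1)
v = rng.exponential(scale=0.5, size=10**6)  # V ~ Exp(2), i.e. mean 1/2
lhs = np.mean(np.log(u / v))                # Monte Carlo for E[log(U/V)]

# Right-hand side of Lemma 2, using E[e^{-sU}] = 1/(1+s), E[e^{-sV}] = 2/(2+s)
rhs, _ = quad(lambda s: (2/(2 + s) - 1/(1 + s)) / s, 0, np.inf)
print(lhs, rhs, np.log(2))  # all three ≈ 0.6931
```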

Remark. When it comes to answering OP's question, Lemma 2 is way too powerful because it applies to a much more general situation. Utilizing the identical-distribution condition, we can come up with a more elementary substitute for our purpose, for example the one in this answer.


Proof of Lemma 2. For each $x \in \mathbb{R}$, let $x_+ = 0 \vee x$ and $x_- = 0 \vee (-x)$ be the positive and negative parts of $x$, respectively. Then by Frullani's integral, $\int_{0}^{\infty} \frac{e^{-as} - e^{-bs}}{s} \, \mathrm{d}s = \log(b/a)$ for $a, b > 0$, and Tonelli's theorem,

$$\begin{align*} \mathbb{E}[(\log(U/V))_+] &= \mathbb{E}[\log(U \vee V) - \log V] \\ &= \mathbb{E}\left[ \int_{0}^{\infty} \frac{e^{-Vs} - e^{-(U\vee V)s}}{s} \, \mathrm{d}s \right] \\ &= \int_{0}^{\infty} \frac{\mathbb{E}[e^{-Vs}] - \mathbb{E}[e^{-(U\vee V)s}]}{s} \, \mathrm{d}s. \end{align*}$$

By the same reasoning, we also have

$$\begin{align*} \mathbb{E}[(\log(U/V))_-] &= \int_{0}^{\infty} \frac{\mathbb{E}[e^{-Us}] - \mathbb{E}[e^{-(U\vee V)s}]}{s} \, \mathrm{d}s. \end{align*}$$

Since $\mathbb{E}[\log (U/V)]$ is assumed to exist in $[-\infty, +\infty]$, at least one of the two integrals above is finite. Hence their difference is well-defined, yielding

$$\begin{align*} \mathbb{E}[\log(U/V)] &= \mathbb{E}[(\log(U/V))_+] - \mathbb{E}[(\log(U/V))_-] \\ &= \int_{0}^{\infty} \frac{\mathbb{E}[e^{-Vs}] - \mathbb{E}[e^{-Us}]}{s} \, \mathrm{d}s \end{align*}$$

as required.

Sangchul Lee
  • 181,930
  • It seems in Lemma 2, it is implicitly assumed that the moment generating function of both $U$ and $V$ is finite for negative reals. – Amir Mar 07 '25 at 07:43
  • 3
    @Amir, The Laplace transform $\mathbb{E}[e^{-sU}]$ exists for all $s \geq 0$ since $U$ is assumed to be positive. (Note that $e^{-sU} \in (0, 1]$ for all $s\geq0$.) This does not necessarily imply that MGF of $U$ exists on a neighborhood of $s=0$, but this has nothing to do with my answer. – Sangchul Lee Mar 07 '25 at 07:48
  • @SangchulLee very creative argument, nicely done! – user159517 Mar 07 '25 at 08:54
  • @SangchulLee thanks for your proof, it is very nice. There is one detail I don't understand, in lemma 1: what is the justification of the existence of the mean of $\log(X/Y)$ (the upper bound $\log(X/Y) \leq -1 + X/Y$ suffices only if the l.h.s. is nonnegative, or am I missing sth?) – dionysos Mar 07 '25 at 10:08
  • 1
    @dionysos, Here, the existence of the expectation is examined in the "extended" sense, allowing values in $[-\infty, +\infty]$. Then, if $U$ and $V$ are RVs such that $U \leq V$ and $V$ is integrable, we know that $$\color{gray}{\mathbb{E}[U] = } \underbrace{\mathbb{E}[V]}_{\in\mathbb{R}} - \underbrace{\mathbb{E}[V-U]}_{\in[0,+\infty]}$$ is a well-defined element of $[-\infty, +\infty)$. – Sangchul Lee Mar 07 '25 at 10:19
  • 1
    @SangchulLee thank you! I got it... – dionysos Mar 07 '25 at 12:03
  • I am still wondering whether a more elementary proof can be given. Only for the case $(X,Y)\sim (Y,X)$, a simple one is available by symmetry. – Amir Mar 07 '25 at 16:21
  • 1
    @Amir, The main difficulty is the lack of information on the joint distribution of $X$ and $Y$. So the best we can hope for is to disentangle them so that their joint distribution is no longer needed. Now, an obvious way of disentangling the ratio $X/Y$ is to take the logarithm, so I think it is not easy to avoid using logarithms.

    On the other hand, I have a naive dream that the rearrangement inequality might possibly be utilized in a clever way to give another proof, although I am not certain.

    – Sangchul Lee Mar 07 '25 at 18:34
  • Could you provide more details on how $\mathbb{E}[X] \in [-\infty, +\infty]$ implies that at least one of $\mathbb{E}[X_+]$ and $\mathbb{E}[X_-]$ is finite? – Amir Mar 09 '25 at 09:54
  • @Amir, It is more or less by definition. In measure theory, we first define the abstract integral for non-negative functions, and then extend it to $\mathbb{R}$-valued functions for which at least one of the positive and negative parts integrates to a finite value. – Sangchul Lee Mar 09 '25 at 10:09
  • I got it. Actually, the Lebesgue integral is used here – Amir Mar 09 '25 at 10:32
  • @SangchulLee May I ask why the Laplace transform $\mathbb{E}\left[e^{-s U}\right]$ exists for all $s \geq 0$? According to the existence criteria for Laplace transforms, one of the prerequisites is that the function being integrated must be of Exponential order (referring to https://www.cs.ucr.edu/~craigs/ucla-courses/135.2.16s/laplace-existence.pdf), so does this mean that $U$ and $V$ need to satisfy the Exponential order prerequisite? Is there something I'm misunderstanding? Thank you very much for any explanation. – Yilin Cheng Mar 17 '25 at 06:51
  • @YilinCheng, Here, what is "Laplace-transformed" is not $U$ but its distribution: $$\mathbb{E}[e^{-sU}]=\int_{[0,\infty)}e^{-su}\,\mathbb{P}(U\in\mathrm{d}u).$$ The right-hand side always exists for $s\geq 0$ whenever $U$ is a non-negative random variable. But we don't need this fancy machinery to show the existence, because it is almost obvious from the expression $\mathbb{E}[e^{-sU}]$ itself. Note that $e^{-sU} \in [0, 1]$ whenever $s\geq 0$ and $U \geq 0$, and so, $0\leq\mathbb{E}[e^{-sU}]\leq 1$ as well! – Sangchul Lee Mar 17 '25 at 08:20
  • @SangchulLee I understand now. My mistake in thinking was: 'I thought Exponential order was a condition that must be satisfied.' Now, please check whether my subsequent understanding is correct: $\mathbb{E}\left[e^{-s U}\right]=\int_0^{+\infty} e^{-s u} P(u)\, \mathrm{d}u \leq \int_0^{+\infty} P(u)\, \mathrm{d}u=1$, where $P(u)$ is the probability density function of $U$, and the inequality follows from $e^{-s u} \in[0,1]$. Looking forward to your reply, thank you very much. – Yilin Cheng Mar 17 '25 at 09:00
  • @YilinCheng, Indeed you are correct! :) – Sangchul Lee Mar 17 '25 at 09:22
  • @SangchulLee HaHa, thank you very much. – Yilin Cheng Mar 17 '25 at 09:53
5

By Jensen's inequality and the result discussed here, one gets

$$ \log \mathbb E[X/Y] \geq \mathbb E[\log (X/Y)]=\mathbb E[\log X - \log Y]=0 $$

using that $\log X \sim \log Y$ and that $\mathbb E[\log (X/Y)]$ exists (because $\log (X/Y)$ is bounded from above by $X/Y - 1$, whose expectation exists by assumption).
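(As a numerical illustration of this chain of inequalities, here is a check of mine with correlated lognormals; the coupling is an arbitrary choice with $X \sim Y$ dependent.)

```python
import numpy as np

# X = e^Z, Y = e^{Z'}, where (Z, Z') are standard normals with correlation 0.8;
# then X ~ Y, and log E[X/Y] = 1 - 0.8 = 0.2 while E[log(X/Y)] = 0.
rng = np.random.default_rng(3)
z = rng.standard_normal(10**6)
zp = 0.8 * z + np.sqrt(1 - 0.8**2) * rng.standard_normal(10**6)
x, y = np.exp(z), np.exp(zp)
print(np.log(np.mean(x / y)), np.mean(np.log(x / y)))  # ≈ 0.2 >= ≈ 0.0
```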

Amir
  • 11,124
  • This answer follows the argument given in the OP and is inspired by Lee's answer. – Amir Mar 09 '25 at 16:43
  • (+1) Just a few things: it seems that from the assumption we have that $E[X/Y]$ exists in $(0,\infty]$, which is enough for your argument. From your posting, it seems that you are saying that $X/Y-1$ is integrable, which I believe is not what you intend to say. The period at the end of your equation makes the argument a little confusing. – Mittens Mar 09 '25 at 17:17
  • @Mittens Thanks I just removed the period, which was a typo. The assumption is exactly what you stated: $\mathbb E[X/Y]$ exists, which implies that $X/Y-1$ is integrable. – Amir Mar 09 '25 at 17:22
  • 2
    @Amir Thanks for this contribution, and thanks to Lee again: in this other discussion, a completely independent solution using only Fatou (and a simple convexity argument) is finally proposed; it is what I was looking for (I didn't think to "reverse" Fatou's lemma...) – dionysos Mar 10 '25 at 09:40
  • @dionysos your nice observation has led to the development of some other basic and useful results that I personally hadn't seen before. Thank you, Lee, and Robert. I hope more interesting connections will be revealed. – Amir Mar 10 '25 at 11:00
0

One can approximate the expectation above by Lebesgue sums, so it is enough to show that every (conveniently chosen) Lebesgue sum has value $\ge 1$. For this, it suffices to check the following: consider an $n\times n$ matrix $(\rho_{ij})$ with non-negative entries and $\sum_{i,j} \rho_{ij} = 1$, with the property that for every $1\le i \le n$ we have $\sum_{j} \rho_{i j} = \sum_{j} \rho_{j i}$ (each row sum equals the corresponding column sum), and let $x_1, \ldots, x_n$ be positive numbers. Then we have the inequality:

$$\sum_{ij}\rho_{ij} \frac{x_i}{x_j} \ge 1$$

Indeed, by the weighted AM–GM inequality, the sum is at least

$$\prod_{ij} \left( \frac{x_i}{x_j}\right)^{\rho_{ij}}$$

and this product is in fact $1$: the exponent of each $x_k$ is $\sum_{j} \rho_{kj} - \sum_{i} \rho_{ik} = 0$ by the row/column-sum condition.

Note: it may not be the most straightforward proof, but it avoids some convergence problems.
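(A small numerical check of the matrix inequality, added here for illustration. One convenient way to produce a matrix with matching row and column sums, a hypothetical construction not from the answer, is $\rho_{ij} = \pi_i P_{ij}$ for a random stochastic matrix $P$ with stationary distribution $\pi$.)

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
P = rng.random((n, n))
P /= P.sum(axis=1, keepdims=True)             # random stochastic matrix
w, V = np.linalg.eig(P.T)                     # stationary distribution pi:
pi = np.real(V[:, np.argmin(np.abs(w - 1))])  # left eigenvector for eigenvalue 1
pi /= pi.sum()
rho = pi[:, None] * P     # row sums = pi = column sums, and total sum is 1
x = rng.random(n) + 0.1   # positive numbers x_1, ..., x_n
print((rho * (x[:, None] / x[None, :])).sum())  # >= 1, as claimed
```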

orangeskid
  • 56,630
-3

Here is a more elegant solution (assuming $X$ and $Y$ are independent continuous variables with common density $f$):

$$ E[\frac{X}{Y}]=\iint_{x<y}\frac{x}{y}f(x)f(y)dx dy+\iint_{y<x}\frac{x}{y}f(x)f(y)dx dy \\\stackrel{symmetry}{=} \iint_{x<y}\frac{x}{y}f(x)f(y)dx dy+\iint_{x<y}\frac{y}{x}f(x)f(y)dx dy\\ =\iint_{x<y}\left(\frac{x}{y}+\frac{y}{x}\right)f(x)f(y)dx dy $$

The AM-GM inequality gives for any $x,y>0$ that $\frac{\frac{x}{y}+\frac{y}{x}}{2}\geq 1$. This yields $$ E[\frac{X}{Y}]\geq \iint_{x<y}2f(x)f(y)dx dy \stackrel{symmetry}{=}\iint_{x,y} f(x)f(y)dx dy =1$$
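(For completeness, a Monte Carlo check of this independent case, added here as an illustration; taking $X, Y$ i.i.d. lognormal is an arbitrary choice, for which $E[X/Y] = E[X]\,E[1/Y] = e^{1/2}\cdot e^{1/2} = e$.)

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.exp(rng.standard_normal(10**6))  # X, Y i.i.d. lognormal(0, 1)
y = np.exp(rng.standard_normal(10**6))
print(np.mean(x / y), np.e)             # both ≈ 2.718, so E[X/Y] = e >= 1
```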