
This post asked for help in proving the inequality \begin{align*} \left(\sum \limits_{k=1}^n (2k-1)\frac{k+1}{k}\right) \left( \sum \limits_{k=1}^n (2k-1)\frac{k}{k+1}\right) \le n^2 \left(\sum \limits_{k=1}^n \frac{k+1}{k}\right) \left( \sum \limits_{k=1}^n \frac{k}{k+1}\right) \end{align*} All answers, including my own, required manually checking the inequality for some cases. My answer reformulated the problem in probability notation and contained only the bits directly relevant to answering the question. I'm opening the discussion to see (1) whether this approach can be improved to answer the problem directly, without having to check some cases by hand, and (2) whether the approach can be extended to general probability distributions.
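
For exploration, here is a minimal sketch (in Python, with exact rational arithmetic) that checks the inequality for small $n$; it is of course not a proof, and the range checked is arbitrary.

```python
from fractions import Fraction

# Check the original inequality for small n (exploration only, not a proof).
def holds(n):
    lhs = sum((2*k - 1) * Fraction(k + 1, k) for k in range(1, n + 1)) * \
          sum((2*k - 1) * Fraction(k, k + 1) for k in range(1, n + 1))
    rhs = n**2 * sum(Fraction(k + 1, k) for k in range(1, n + 1)) \
               * sum(Fraction(k, k + 1) for k in range(1, n + 1))
    return lhs <= rhs

print(all(holds(n) for n in range(1, 201)))  # prints True
```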

The Reformulation

If we let $X$ be a discrete uniform random variable on $\{1, \cdots, n\}$, then each sum $\sum_{k=1}^n h(k)$ equals $n\,\mathbb{E}[h(X)]$ and $\mathbb{E}[2X-1] = n$, so dividing both sides by $n^2$ shows the inequality is equivalent to \begin{align*} \mathbb{E}\left[(2X-1)\frac{X+1}{X}\right]\mathbb{E}\left[(2X-1)\frac{X}{X+1}\right] \le \mathbb{E}[2X-1]\mathbb{E}\left[\frac{X+1}{X}\right]\mathbb{E}[2X-1]\mathbb{E}\left[\frac{X}{X+1}\right] \end{align*} If we set $f(x) = 2x-1$, $g_1(x) = \frac{x+1}{x}$, $g_2(x) = \frac{x}{x+1}$, then since $f$ is increasing while $g_1$ is decreasing and $g_2$ is increasing, Chebyshev's sum inequality gives \begin{align*} \mathbb{E}[f(X)g_1(X)] &\le \mathbb{E}[f(X)]\mathbb{E}[g_1(X)] \\ \mathbb{E}[f(X)g_2(X)] &\ge \mathbb{E}[f(X)]\mathbb{E}[g_2(X)] \end{align*} So the two terms are "fighting", and the claim is that the former wins out, making the overall inequality $\le$.
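
A quick numerical sketch of these two opposing bounds, and of the identity $\mathbb{E}[2X-1]=n$ used above, again in exact arithmetic (the range of $n$ checked is arbitrary):

```python
from fractions import Fraction

def E(h, n):
    # E[h(X)] for X uniform on {1, ..., n}, in exact rational arithmetic
    return sum(h(k) for k in range(1, n + 1)) / Fraction(n)

f  = lambda x: 2 * x - 1            # increasing
g1 = lambda x: Fraction(x + 1, x)   # decreasing
g2 = lambda x: Fraction(x, x + 1)   # increasing

for n in range(1, 101):
    assert E(f, n) == n                                        # E[2X-1] = n
    assert E(lambda x: f(x) * g1(x), n) <= E(f, n) * E(g1, n)  # opposite monotonicity
    assert E(lambda x: f(x) * g2(x), n) >= E(f, n) * E(g2, n)  # same monotonicity
```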

Some Exploration

A natural first step to gain more information is considering a summation instead of a product: \begin{align*} \mathbb{E}[f(X)g_1(X)] + \mathbb{E}[f(X)g_2(X)] &= \mathbb{E}\left[f(X)\left(g_1(X) + g_2(X) \right)\right] \\ &\le \mathbb{E}\left[f(X)\right]\mathbb{E}[g_1(X) + g_2(X)] \tag{$\spadesuit$} \end{align*} since $g_1(x) + g_2(x) = \frac{x+1}{x} + \frac{x}{x+1}$ is decreasing, so we may apply Chebyshev's sum inequality again. This appears promising: the sum of $\mathbb{E}[f(X)g_1(X)]$ and $\mathbb{E}[f(X)g_2(X)]$ is at most the sum of $\mathbb{E}[f(X)]\mathbb{E}[g_1(X)]$ and $\mathbb{E}[f(X)]\mathbb{E}[g_2(X)]$, and we would like to extend this to the product. Unfortunately, this sum bound is too weak, and I couldn't find any way to translate it into a statement about the product.
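
As a sanity check, here is a small sketch verifying $(\spadesuit)$ in exact arithmetic for small $n$ (the checked range is arbitrary):

```python
from fractions import Fraction

def E(h, n):
    # E[h(X)] for X uniform on {1, ..., n}
    return sum(h(k) for k in range(1, n + 1)) / Fraction(n)

f   = lambda x: 2 * x - 1                                  # increasing
g12 = lambda x: Fraction(x + 1, x) + Fraction(x, x + 1)    # g1 + g2, decreasing

for n in range(1, 101):
    # (spadesuit): E[f(X)(g1(X)+g2(X))] <= E[f(X)] E[g1(X)+g2(X)]
    assert E(lambda x: f(x) * g12(x), n) <= E(f, n) * E(g12, n)
```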

Something more general

The inequality $(\spadesuit)$ in fact holds for any non-decreasing function $f$, so we can bound \begin{align*} \mathbb{E}[g_1(X)] + \mathbb{E}[g_2(X)] &\ge \sup_{f: f \nearrow} \frac{\mathbb{E}[f(X)g_1(X)] + \mathbb{E}[f(X)g_2(X)]}{\mathbb{E}[f(X)]} \\ &=\sup_{f: \mathbb{E}[f(X)]=1,f \nearrow} \mathbb{E}[f(X)g_1(X)] + \mathbb{E}[f(X)g_2(X)] \end{align*} where $f \nearrow$ means $f$ is non-decreasing (with $\mathbb{E}[f(X)] > 0$, so the normalization makes sense). For $X$ discrete uniform on $\{1, \cdots, n\}$, taking the particular choice $f(x) = n \mathbf{1}_{\{n\}}(x)$, which is non-decreasing with $\mathbb{E}[f(X)] = 1$, already gives \begin{align*} \mathbb{E}[g_1(X)] + \mathbb{E}[g_2(X)] \ge \frac{n+1}{n} + \frac{n}{n+1} = 2 + \frac{1}{n(n+1)} \end{align*} This is certainly a stronger bound than the one obtained by more conventional means: \begin{align*} \mathbb{E}[g_1(X)] + \mathbb{E}[g_2(X)] &\ge 2\sqrt{\mathbb{E}[g_1(X)]\mathbb{E}[g_2(X)]} & \text{AM-GM}\\ &\ge2\sqrt{\mathbb{E}[g_1(X)g_2(X)]} & \text{Chebyshev's sum} \\ &=2 & \text{since $g_1 g_2 \equiv 1$} \end{align*}
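
A brief numerical comparison of $\mathbb{E}[g_1(X)] + \mathbb{E}[g_2(X)]$ against the bound $2 + \frac{1}{n(n+1)}$ (sketch only; the range of $n$ checked is arbitrary):

```python
from fractions import Fraction

def E(h, n):
    # E[h(X)] for X uniform on {1, ..., n}
    return sum(h(k) for k in range(1, n + 1)) / Fraction(n)

g1 = lambda x: Fraction(x + 1, x)
g2 = lambda x: Fraction(x, x + 1)

for n in range(1, 101):
    total = E(g1, n) + E(g2, n)
    bound = 2 + Fraction(1, n * (n + 1))   # from f(x) = n·1_{x = n}
    assert total >= bound > 2              # the new bound beats the conventional bound of 2
```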

Future directions

This post is meant to be quite open-ended, and any insight is welcome.
