
Let $X, X_i$, $1 \leq i \leq n$ be IID random variables such that:

$$\mathbb{E}X^2 = +\infty$$ I am trying to show that this implies a growth condition on $$\mathbb{E}\left[\max_{1\leq i \leq n}{|X_i|}\right].$$ Specifically, I want to show for any $\epsilon > 0$ there exists some $c = c(\epsilon)$ such that

$$\mathbb{E}\left[\max_{1 \leq i \leq n}|X_i|\right] \geq c n^{1/(2+\epsilon)}.$$

Our instructor suggested that we prove the statement by contraposition, i.e. we show that: $$ \liminf_{n\to\infty} \frac{\mathbb{E}[\max_{1 \leq i \leq n}|X_i|]}{n^{1/(2+\epsilon)}} = 0 \implies \mathbb{E}[X^2] < +\infty. $$

My attempt: The condition above is equivalent to saying that there exists a subsequence $n_k$ such that: $$ \lim_{k \to \infty} \frac{\mathbb{E}[\max_{1 \leq i \leq n_k}|X_i|]}{n_k^{1/(2+\epsilon)}} = 0. $$

Let $a_n$ be any monotone increasing sequence of positive real numbers such that $\lim a_n = +\infty$. By the monotone convergence theorem and Fubini's theorem, and since $t \mapsto \mathbb{P}(X^2 > t)$ is non-increasing: $$ \mathbb{E}[X^2] = \int_0^\infty \mathbb{P}(X^2 > t)\, dt = \int_0^{a_1} \mathbb{P}(X^2 > t)\, dt + \sum_{n=1}^\infty \int_{a_n}^{a_{n+1}}\mathbb{P}(X^2 > t)\, dt \leq a_1 + \sum_{n=1}^\infty \underbrace{(a_{n+1} - a_n)}_{=: \Delta a_n} \mathbb{P}(X^2 > a_n) $$ (the piece $\int_0^{a_1}$ is at most $a_1 < \infty$, so it does not affect convergence). Thus, it suffices to show that the sum on the right-hand side is finite for $a_n$ appropriately chosen.
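
Just as a numerical sanity check of this last bound (not part of the proof), here is a short Python sketch for a square-integrable example, $X \sim \mathrm{Exp}(1)$ with $a_n = n-1$ (so $a_1 = 0$ and the $\int_0^{a_1}$ term vanishes), where both sides are computable:

```python
import numpy as np
from scipy import stats

# Sanity check of E[X^2] = int_0^inf P(X^2 > t) dt <= sum_n (a_{n+1}-a_n) P(X^2 > a_n)
# for a square-integrable example: X ~ Exp(1), so E[X^2] = 2, with a_n = n - 1.
X = stats.expon()

a = np.arange(0.0, 2000.0)            # a_n = n - 1; far enough out for this check
tail = X.sf(np.sqrt(a))               # P(X^2 > a_n) = P(X > sqrt(a_n)) since X >= 0
upper = np.sum(np.diff(a) * tail[:-1])

print(f"E[X^2]      = {X.moment(2):.4f}")   # exact value: 2
print(f"upper bound = {upper:.4f}")         # larger, as the inequality predicts
```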

To use information about the maximum, write $M_n = \max_{1\leq i \leq n} |X_i|$ for the running maximum. By independence, for $t > 0$: $$ \mathbb{P}(M_n > t) = 1 - (\mathbb{P}(|X|\leq t))^n = 1 - (1 - \mathbb{P}(|X| > t))^n $$ Rearranging gives: $$ \mathbb{P}(|X| > t) = 1 - (1 - \underbrace{\mathbb{P}(M_n > t)}_{x})^{1/n} $$ The right-hand side is increasing as a function of the argument $x$. Hence, applying Markov's inequality to $M_n$ gives: $$ \mathbb{P}(|X| > t) \leq 1 - \left(1 - \frac{\mathbb{E}[M_n]}{t}\right)^{1/n} $$ for any integer $n \geq 1$ and any $t \geq \mathbb{E}[M_n]$ (so that the base $1 - \mathbb{E}[M_n]/t$ lies in $[0,1]$). Applying this bound at $t^{1/2}$ in place of $t$ gives:

$$ \mathbb{P}(X^2 > t) = \mathbb{P}(|X| > t^{1/2}) \leq 1 - \left(1 - \frac{\mathbb{E}[M_n]}{t^{1/2}}\right)^{1/n} $$ Hence, for $a_k$ as above, we have: $$ \mathbb{E}[X^2] \leq a_1 + \sum_{k=1}^{\infty} \left(1 - \left(1 - \frac{\mathbb{E}[M_n]}{a_k^{1/2}}\right)^{1/n}\right)\Delta a_k, $$ where the finitely many terms with $a_k^{1/2} < \mathbb{E}[M_n]$ can simply be bounded by $\Delta a_k$ (i.e. bounding the probability by $1$).
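
Again purely as a sanity check, the tail bound can be tested numerically for a distribution where everything is computable, say $X \sim \mathcal N(0,1)$, with $\mathbb{E}[M_n]$ estimated by Monte Carlo (a sketch, not part of the argument; the bound needs $t^{1/2} \geq \mathbb{E}[M_n]$ and is quite loose):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Check P(X^2 > t) <= 1 - (1 - E[M_n]/t^{1/2})^{1/n} for X ~ N(0,1),
# where M_n = max_{1<=i<=n} |X_i|; the bound needs t^{1/2} >= E[M_n].
n = 10
EM = np.abs(rng.standard_normal((200_000, n))).max(axis=1).mean()  # MC estimate of E[M_n]

for t in [25.0, 50.0, 100.0]:
    lhs = 2 * stats.norm.sf(np.sqrt(t))          # P(X^2 > t) = P(|X| > sqrt(t))
    rhs = 1 - (1 - EM / np.sqrt(t)) ** (1 / n)
    print(f"t = {t:6.1f}   P(X^2 > t) = {lhs:.2e}   bound = {rhs:.2e}")
```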

My question now is: what is the right choice of the sequence $a_n$? I was considering using the subsequence along which the quotient:

$$ \mathbb{E}[M_{n_k}]/{n_k}^{1/(2+\epsilon)} \to 0 $$

But this worries me a bit because I can't control the size of the gaps $n_{k+1} - n_k$ (the subsequence could be very sparse). One way I was thinking of dealing with that was through summation by parts, but I'm not sure whether that would help.

Any hints or ideas would be appreciated!

rubikscube09
    You are absolutely right: the statement, as you wrote it, is false. All you can do is to show that the desired inequality holds along some subsequence (with the alternative that the reverse inequality holds for all $n$, which you know how to refute). – fedja Dec 04 '24 at 00:48
  • Is it so clear that it does not hold? For Pareto distributions, that is, densities of the form $\alpha t^{-\alpha-1}\mathbf{1}_{t>1}$, for $\alpha=2$, we get a lower bound of order $n^{1/2}$ for the expectation of the max. It is also possible to reduce to the case where $X_i$ has a discrete distribution and takes the value $2^i$ with probability $p_i$. In this case, $E[\max_{1\leq i\leq n}X_i]=\sum_{i=1}^\infty 2^i\left(c_i^n-c_{i-1}^n \right)$ where $c_i=\sum_{\ell=1}^ip_\ell$. – Davide Giraudo Dec 05 '24 at 15:42

1 Answer


Here are some thoughts. First, if $(Y_i)$ is a collection of identically distributed and non-negative random variables, then $\mathbb E[\max_{1\leqslant i\leqslant n}Y_i]\leqslant \lVert \max_{1\leqslant i\leqslant n}Y_i\rVert_p\leqslant n^{1/p}\lVert Y_1\rVert_p$. Doing a truncation argument, one can see that if $X_1\in\mathbb L^p$, then $n^{-1/p}\mathbb E\left[\max_{1\leqslant i\leqslant n}\lvert X_i\rvert\right]\to 0$. Therefore, if we want the rates mentioned in the opening post, that is, that for each $\alpha\in (0,1/2)$ there exists $c_\alpha>0$ such that $\mathbb E\left[\max_{1\leqslant i\leqslant n}\lvert X_i\rvert\right]\geqslant c_\alpha n^{1/2-\alpha}$, then necessarily $X_1\notin\mathbb L^p$ for every $p>2$.
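
A minimal Monte Carlo sketch of both facts (hedged: the choice $X_1\sim\mathcal N(0,1)$ and $p=4$ are mine, not from the argument above; for the standard normal $\lVert X_1\rVert_4 = 3^{1/4}$ since $\mathbb E[X^4]=3$):

```python
import numpy as np

rng = np.random.default_rng(1)

# Checks E[max |X_i|] <= n^{1/p} ||X_1||_p and that n^{-1/p} E[max |X_i|] -> 0
# when X_1 is in L^p.  Here X_1 ~ N(0,1) and p = 4, so ||X_1||_4 = 3^{1/4}.
p = 4.0
norm_p = 3.0 ** (1 / p)

for n in [10, 100, 1000]:
    reps = 10**6 // n                  # keep the sample matrix at a fixed total size
    EM = np.abs(rng.standard_normal((reps, n))).max(axis=1).mean()
    print(f"n = {n:5d}   E[max] = {EM:.3f}   bound = {n**(1/p) * norm_p:.3f}   "
          f"n^(-1/p) E[max] = {EM / n**(1/p):.3f}")   # last column should decrease
```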

Another observation is that if $X_1$ has density $2t^{-3}\mathbb{1}_{t>1}$ (for which the square integrability of $X_1$ is barely violated), then $\max_{1\leqslant i\leqslant n}X_i$ has density $2nt^{-3}\left(1-t^{-2}\right)^{n-1}\mathbb{1}_{t>1}$. As a consequence, after the substitution $u=t^{-2}$, we find that $$ \mathbb E\left[\max_{1\leqslant i\leqslant n}\lvert X_i\rvert\right]=n\int_0^1u^{-1/2}(1-u)^{n-1}du=n\operatorname{B}(1/2,n)=n\frac{\Gamma(1/2)\Gamma(n)}{\Gamma(n+1/2)}, $$ which behaves as $\sqrt{\pi n}$, hence of order $n^{1/2}$.
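
A quick numerical check of this computation (a sketch; note the Monte Carlo column is noisy, since $\max_i X_i$ has infinite variance here):

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(2)

# X has density 2 t^{-3} on (1, inf) (Pareto, tail index 2), so X = U^{-1/2}
# with U ~ Unif(0,1).  Exact value: E[max] = n Gamma(1/2) Gamma(n) / Gamma(n+1/2).
for n in [10, 100, 1000]:
    reps = 10**6 // n
    u = 1.0 - rng.random((reps, n))              # uniform on (0, 1], avoids u = 0
    mc = (u ** -0.5).max(axis=1).mean()          # noisy: the max has infinite variance
    exact = n * np.exp(gammaln(0.5) + gammaln(n) - gammaln(n + 0.5))
    print(f"n = {n:5d}   MC = {mc:7.2f}   exact = {exact:7.2f}   "
          f"sqrt(pi n) = {np.sqrt(np.pi * n):7.2f}")
```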

A last observation is that we can reduce to the case where $X_1$ takes values that are powers of $2$: indeed, we first reduce to the case where the $X_i$ are non-negative and we define $Y_i:=\sum_{k=1}^\infty 2^k\mathbf{1}_{2^k\leqslant X_i<2^{k+1}}$. Then $Y_i\leqslant X_i\leqslant 2Y_i$ on $\{X_i\geqslant 2\}$, and the bounded part of $X_i$ is irrelevant for square integrability. Letting $p_k:=\mathbb P(Y_1=2^k)$, one can compute the law of $\max_{1\leqslant i\leqslant n}Y_i$ in the following way: $\mathbb P(\max_{1\leqslant i\leqslant n}Y_i=2^\ell)=\mathbb P(\max_{1\leqslant i\leqslant n}Y_i\leqslant 2^\ell)-\mathbb P(\max_{1\leqslant i\leqslant n}Y_i\leqslant 2^{\ell-1})=\left(\sum_{k=1}^\ell p_k\right)^n-\left(\sum_{k=1}^{\ell-1} p_k\right)^n$, hence $$ \mathbb E\left[\max_{1\leqslant i\leqslant n}Y_i\right]=\sum_{\ell=1}^\infty 2^\ell \left(\left(\sum_{k=1}^\ell p_k\right)^n-\left(\sum_{k=1}^{\ell-1} p_k\right)^n\right). $$ Letting $s_\ell:=\sum_{k=1}^\ell p_k$ (with $s_0:=0$), this can be rewritten as $$ \mathbb E\left[\max_{1\leqslant i\leqslant n}Y_i\right]=\sum_{\ell=1}^\infty 2^\ell \int_{s_{\ell-1}}^{s_\ell}nt^{n-1}dt. $$ If $t\in [s_{\ell-1},s_\ell]$, then $2^{\ell}$ behaves as $Q_{Y_1}(1-t)$, where $Q_{Y_1}(u)=\inf\{ s : \mathbb P(Y_1>s)\leqslant u\}$. Therefore
$$ \mathbb E\left[\max_{1\leqslant i\leqslant n}Y_i\right]\geqslant 2^{-1} \int_{0}^{1}nQ_{Y_1}(1-t)t^{n-1}dt. $$ The assumption we have at our disposal is that $\int_0^1Q_{Y_1}^2(u)du=+\infty$. Maybe a reverse Hölder inequality can help.
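
As a sanity check of the displayed identity (a sketch under an assumed concrete choice: $p_k = 3\cdot 4^{-k}$, which gives $\mathbb E[Y_1^2]=\sum_k 4^k p_k=+\infty$ as in the question; the simulation column fluctuates because the maximum is heavy-tailed):

```python
import numpy as np

rng = np.random.default_rng(3)

# Y takes the value 2^k with probability p_k = 3 * 4^{-k}, k >= 1, so that
# E[Y^2] = sum_k 4^k p_k = +inf while E[Y] = 3.  Truncate at K levels.
K = 40
k = np.arange(1, K + 1)
p = 3.0 * 4.0 ** (-k)
p /= p.sum()                                   # renormalize the tiny truncated tail away
s = np.concatenate(([0.0], np.cumsum(p)))      # s_0 = 0, s_l = p_1 + ... + p_l

n = 50
formula = np.sum(2.0 ** k * (s[1:] ** n - s[:-1] ** n))

# Monte Carlo via the quantile function: Y = 2^l when U lies in (s_{l-1}, s_l]
u = rng.random((20_000, n))
l = np.searchsorted(s[1:], u, side="left") + 1
mc = (2.0 ** l).max(axis=1).mean()             # noisy: heavy-tailed maximum

print(f"formula = {formula:.2f}   Monte Carlo = {mc:.2f}")
```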

Davide Giraudo