
Let $(X_n)$ be a sequence of independent random variables with distribution $$\mathbb{P}(X_n = -n) = \mathbb{P}(X_n = n) = \frac{1}{2} p_n,\qquad\mathbb{P}(X_n = 0) = 1 - p_n,$$ where $0 \leq p_n \leq 1$. Prove that $\dfrac{S_n}{n}$ converges in probability to $0$ if and only if the series $\sum_{n=1}^{\infty} \dfrac{\mathbb{E} [X_n^2]}{n^2}$ is finite.
(Here $S_n=X_1+X_2+\dots+X_n$.)

I have done the $\Leftarrow$ part:

Since $\mathbb{E}[X_n]=0$, we have $\operatorname{Var}[X_n]=\mathbb{E}[X_n^2]=n^2p_n$, so $$\sum_{n=1}^{\infty}\dfrac{\operatorname{Var} [X_n]}{n^2}=\sum_{n=1}^{\infty}p_n=\sum_{n=1}^{\infty}\dfrac{\mathbb{E} [X_n^2]}{n^2}<\infty.$$ As $(X_n)$ is a sequence of independent random variables and $a_n=n$ tends to infinity, Kolmogorov's strong law of large numbers gives $\dfrac{S_n}{n}\rightarrow 0$ almost surely, hence also in probability.
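(Not part of the argument, but a quick Monte Carlo sanity check of this direction, with the arbitrary summable choice $p_n=n^{-3/2}$, behaves as expected:)

```python
import numpy as np

# Sanity check of the "<=" direction with the arbitrary summable choice
# p_n = n^{-3/2}: the empirical tail P(|S_N / N| > 0.25) should be near 0.
rng = np.random.default_rng(0)
N, trials = 10_000, 500
n = np.arange(1, N + 1)
p = n ** -1.5                                  # sum p_n < infinity
hit = rng.random((trials, N)) < p              # X_n != 0 with probability p_n
sign = rng.choice([-1, 1], size=(trials, N))
X = np.where(hit, sign * n, 0)
print(np.mean(np.abs(X.sum(axis=1)) / N > 0.25))   # close to 0
```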
I'm stuck at the $\Rightarrow$ part: given that $\dfrac{S_n}{n}\rightarrow 0$ in probability, I need to show that $\sum_{n=1}^{\infty}\dfrac{\mathbb{E} [X_n^2]}{n^2}<\infty$.
Could someone help me? Thanks in advance.

Davide Giraudo
  • 181,608
Alex Nguyen
  • 615
  • By definition, it looks like $E(X_n^2) = n^2 p_n/2 + n^2 p_n/2 + 0\cdot(1-p_n) = n^2 p_n \implies \frac{E(X_n^2)}{n^2} = p_n$. So you need to show $\sum_n p_n < \infty$. –  Oct 19 '24 at 09:50
  • @Balajisb yes, that's what I wrote in my attempt. – Alex Nguyen Oct 19 '24 at 09:51
  • If convergence in probability were replaced by almost sure convergence, then the Borel–Cantelli lemma would imply (if the sum is infinite) that $\frac{X_{n}}{n}=1$ for infinitely many $n$ and $-1$ for infinitely many $n$. And it is necessary that $\frac{X_{n}}{n}\to 0$ if $S_{n}/n\to 0$, as $\frac{X_{n}}{n}=\frac{S_{n}}{n}-\frac{S_{n-1}}{n}$. – Mr. Gandalf Sauron Oct 19 '24 at 10:17
  • Try using: $E\left(\sum_{n} \frac{X_n^2}{n^2}\right) = E\left(\sum_{n} \frac{(S_n - S_{n-1})^2}{n^2} \right) $ –  Oct 19 '24 at 10:35
  • Do you assume anything on $p_k$? A wild guess would be that $|S_n|$ grows like $\sqrt{\sum_{k=1}^nk^2\,p_k}$, but this need not be $\ge n$. – nejimban Oct 19 '24 at 21:12

2 Answers


$\fbox{$\Longleftarrow$}\enspace$ (Without Kolmogorov's SLLN.) Suppose $\sum_{n=1}^\infty p_n<\infty$. For every $\varepsilon>0$, Chebyshev's inequality gives, since the $X_k$ are independent and centered, \begin{align*} \Bbb P(|S_n|>n\varepsilon)&\le\frac{\operatorname{Var}(S_n)}{n^2\varepsilon^2}\\[.4em] &=\frac1{n^2\varepsilon^2}\sum_{k=1}^n\operatorname{\Bbb E}\bigl[X_k^2\bigr]\\[.4em] &=\frac1{\varepsilon^2}\cdot\frac1{n^2}\sum_{k=1}^nk^2\,p_k\\[.4em] &\xrightarrow[n\to\infty]{}0, \end{align*} by Kronecker's lemma.
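(The Kronecker step can be sanity-checked numerically; a minimal sketch with the illustrative summable choice $p_k=k^{-3/2}$, though any summable sequence in $[0,1]$ would do:)

```python
import numpy as np

# Kronecker's lemma, numerically: sum p_k < oo implies n^{-2} sum_{k<=n} k^2 p_k -> 0.
k = np.arange(1, 10**6 + 1, dtype=float)
p = k ** -1.5                            # illustrative summable choice
a = np.cumsum(k**2 * p) / k**2           # a_n = n^{-2} sum_{k<=n} k^2 p_k
for n in (10**2, 10**4, 10**6):
    print(n, a[n - 1])                   # decays towards 0 (like n^{-1/2} here)
```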

$\fbox{$\Longrightarrow$}\enspace$ Suppose $\sum_{n=1}^\infty p_n=\infty$. We need the extra assumption that $$\sqrt{\sum_{k=1}^nk^2\,p_k}\ge c\,n,\tag{$\star$}$$ at least along a subsequence, where $c>0$ is some constant. (See the counterexample below for what can happen when this condition is not satisfied.)

  • On the one hand, thanks to the lower bound in the Khintchine inequality for $p=1$, there exists $A_1>0$ such that $$A_1\left(\sum_{k=1}^n|X_k|^2\right)^{\frac12}\le\operatorname{\Bbb E}\left[\left|\sum_{k=1}^nX_k\right|\:\middle|\:|X_1|,\ldots,|X_n|\right].\tag{$\heartsuit$}$$ Taking expectations and writing $W_n:=\sum_{k=1}^n|X_k|^2=\sum_{k=1}^nk^2\mathbf 1_{\{|X_k|=k\}}$, this yields $\operatorname{\Bbb E}|S_n|\ge A_1\operatorname{\Bbb E}\sqrt{W_n}$. By Hölder's inequality, $\operatorname{\Bbb E} W_n=\operatorname{\Bbb E}\bigl[W_n^{1/3}\,W_n^{2/3}\bigr]\le\bigl(\operatorname{\Bbb E}\sqrt{W_n}\bigr)^{2/3}\bigl(\operatorname{\Bbb E} W_n^2\bigr)^{1/3}$, hence $$\operatorname{\Bbb E}\sqrt{W_n}\ge\frac{\left(\operatorname{\Bbb E} W_n\right)^{3/2}}{\left(\operatorname{\Bbb E} W_n^2\right)^{1/2}}.$$ Writing $V_n:=\sum_{k=1}^nk^2\,p_k$, we have $\operatorname{\Bbb E} W_n=V_n$ and, by independence, $$\operatorname{\Bbb E} W_n^2=\sum_{k=1}^nk^4p_k+\sum_{j\ne k}j^2k^2p_jp_k\le n^2V_n+V_n^2\le(1+c^{-2})V_n^2,$$ the last step using $(\star)$ (so along the subsequence where $(\star)$ holds). Altogether, with $A_2:=A_1/\sqrt{1+c^{-2}}$, $$\left(\operatorname{\Bbb E}{|S_n|}\right)^2\ge A_2^2\sum_{k=1}^nk^2\,p_k.\tag{1}$$
  • On the other hand, $$\operatorname{\Bbb E}{|S_n|^2}=\sum_{k=1}^nk^2\,p_k.\tag{2}$$
  • Combining $(1)$ and $(2)$ with the Paley–Zygmund inequality, we obtain that $$\Bbb P\Bigl(|S_n|\ge\theta_n\operatorname{\Bbb E}{|S_n|}\Bigr)\ge(1-\theta_n)^2\frac{\left(\operatorname{\Bbb E}{|S_n|}\right)^2}{\operatorname{\Bbb E}{|S_n|^2}}\ge(1-\theta_n)^2A_2^2,$$ for every $0\le\theta_n\le1$. Choose $$\theta_n:=\frac{c\,n}{2\sqrt{\sum_{k=1}^nk^2\,p_k}}\in\left[0,\frac12\right].$$ Then, by $(1)$, $$\theta_n\operatorname{\Bbb E}{|S_n|}\ge\frac{cA_2\,n}2,$$ giving \begin{align*} \Bbb P\!\left(\frac{|S_n|}n\ge\frac{cA_2}2\right) &\ge\Bbb P\Bigl(|S_n|\ge\theta_n\operatorname{\Bbb E}{|S_n|}\Bigr) \\[.4em] &\ge\frac{A_2^2}4\\[.4em] &>0, \end{align*} along a subsequence. Hence $\frac1nS_n$ does not converge to $0$ in probability.
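A quick Monte Carlo illustration of this lower bound (only a sanity check, not part of the proof): take the hypothetical choice $p_k\equiv 1$, so $X_k=\pm k$ with equal probability and $(\star)$ holds, since $\sqrt{\sum_{k\le n}k^2}\sim n^{3/2}/\sqrt3\ge n$. The probability that $|S_n|/n$ exceeds $1$ stays bounded away from $0$ (here it even tends to $1$, because $\sum_{k\le n}k^2p_k\gg n^2$).

```python
import numpy as np

# p_k = 1 for all k: X_k = +-k with probability 1/2 each, and (star) holds.
# Estimate P(|S_n| / n >= 1) for growing n; it stays bounded away from 0.
rng = np.random.default_rng(1)
trials = 20_000
for n in (10, 100, 1000):
    k = np.arange(1, n + 1)
    S = (rng.choice([-1, 1], size=(trials, n)) * k).sum(axis=1)
    print(n, np.mean(np.abs(S) >= n))
```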

Counterexample when $(\star)$ is not fulfilled. Assume $p_n\sim\frac1{n\log n}$ (so that $\sum_n p_n=\infty$). Then $$\sum_{k=1}^nk^2\,p_k\lesssim\frac{n^2}{\log n},$$ so $$\operatorname{\Bbb E}{\left|\frac{S_n}n\right|^2}=\frac1{n^2}\sum_{k=1}^nk^2\,p_k\lesssim\frac1{\log n}\xrightarrow[n\to\infty]{}0.$$ In this case $\frac1nS_n$ converges to $0$ in $L^2$ (and hence in probability).
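This computation can be checked numerically (a sketch, with $p_k=\frac1{k\log k}$ for $k\ge 2$): the partial sums $\sum_{k\le n}p_k$ grow like $\log\log n$, while $\frac1{n^2}\sum_{k\le n}k^2p_k$ decays like $\frac1{2\log n}$.

```python
import numpy as np

# p_k = 1/(k log k) for k >= 2: sum p_k diverges (like log log n), while
# E[(S_n/n)^2] = n^{-2} sum_{k<=n} k^2 p_k ~ 1/(2 log n) -> 0.
k = np.arange(2, 10**6 + 1, dtype=float)
p = 1.0 / (k * np.log(k))
sum_p = np.cumsum(p)
m2 = np.cumsum(k**2 * p) / k**2          # second moment of S_n / n
for n in (10**3, 10**5, 10**6):
    print(n, sum_p[n - 2], m2[n - 2], 1 / (2 * np.log(n)))
```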

nejimban
  • 4,107

The following are equivalent:

  1. $\left(\frac{S_n}n\right)$ converges to $0$ in probability;
  2. $\lim_{\ell\to\infty}\sum_{k=2^\ell}^{2^{\ell+1}}p_k=0$;
  3. $\lim_{n\to\infty}\frac 1{n^2}\sum_{k=1}^nk^2p_k=0$.

First, notice that 2. is equivalent to 3. Indeed, letting $a_n:=\frac 1{n^2}\sum_{k=1}^nk^2p_k$, we observe that if $2^N\leqslant n\leqslant 2^{N+1}$, then $a_{2^N}/4\leqslant a_n\leqslant 4a_{2^{N+1}}$; hence 3. is equivalent to $$ \tag{*}\lim_{N\to\infty}\frac 1{2^{2N}}\sum_{k=1}^{2^N}k^2p_k=0. $$

Suppose that 2. holds. Then $$ \frac 1{2^{2N}}\sum_{k=1}^{2^N}k^2p_k\leqslant\frac{p_1}{2^{2N}}+\frac 1{2^{2N}}\sum_{\ell=0}^N \sum_{k=2^\ell+1}^{2^{\ell+1}}k^2p_k\leqslant \frac{p_1}{2^{2N}}+\frac 4{2^{2N}}\sum_{\ell=0}^N2^{2\ell} \sum_{k=2^\ell+1}^{2^{\ell+1}} p_k, $$ and we conclude that (*) (hence 3.) holds, using the fact that if $\delta_\ell\to 0$, then $2^{-2N}\sum_{\ell=1}^N 2^{2\ell}\delta_\ell\to 0$ as well.

Conversely, assume that 3. (hence (*)) holds. Since $$ \frac 1{2^{2N}}\sum_{k=1}^{2^N}k^2p_k \geqslant \frac 1{2^{2N}}\sum_{k=2^{N-1}}^{2^N}k^2p_k\geqslant\frac 14 \sum_{k=2^{N-1}}^{2^N} p_k, $$ we get 2.
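(A numerical look at this equivalence, borrowing the illustrative choice $p_k=\frac1{k\log k}$ from the other answer, an assumption for illustration only: the dyadic block sums of condition 2. and the averages $a_n$ of condition 3. tend to $0$ together.)

```python
import numpy as np

# Conditions 2. and 3. side by side for p_k = 1/(k log k), k >= 2:
# block sums sum_{k=2^l}^{2^{l+1}} p_k and a_{2^l} both tend to 0.
k = np.arange(2, 2**21 + 1, dtype=float)
p = 1.0 / (k * np.log(k))
a = np.cumsum(k**2 * p) / k**2
for ell in (5, 10, 15, 20):
    block = p[(k >= 2**ell) & (k <= 2**(ell + 1))].sum()
    print(ell, block, a[2**ell - 2])     # both -> 0 (block sum ~ 1/ell)
```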

Assume that 3. holds. By computing $\mathbb E\left[S_n^2\right]$, we find, by independence and the fact that the $X_j$ are centered, that $$ \mathbb E\left[\left(\frac{S_n}n\right)^2\right]=\frac 1{n^2}\sum_{k=1}^nk^2p_k. $$ We deduce from condition 3. that $S_n/n\to 0$ in $L^2$, hence in probability.

Assume that $S_n/n\to 0$ in probability. By Lévy's inequality, together with the independence and symmetry of the $X_j$, we derive that $N^{-1}\max_{1\leqslant n\leqslant N} \lvert S_n\rvert\to 0$ in probability. Therefore, denoting $S_0=0$, $$\frac1N\max_{1\leqslant n\leqslant N}\lvert X_n\rvert=\frac1N\max_{1\leqslant n\leqslant N}\lvert S_n-S_{n-1}\rvert\leqslant \frac 2{N}\max_{1\leqslant n\leqslant N} \lvert S_n\rvert,$$ hence $N^{-1}\max_{1\leqslant n\leqslant N} \lvert X_n\rvert\to 0$ in probability. In particular, we get that $$ \frac 1{2^{\ell+1}}\max_{2^\ell+1\leqslant k\leqslant 2^{\ell+1}}\lvert X_k\rvert\to 0\mbox{ in probability}. $$ Let $A_k:=\{X_k=k\}\cup\{X_k=-k\}$. Since, for $2^\ell+1\leqslant k\leqslant 2^{\ell+1}$, $\lvert X_k\rvert\geqslant k\mathbf{1}_{A_k}\geqslant 2^\ell\mathbf{1}_{A_k}$, we infer that $$ \lim_{\ell\to\infty}\mathbb P\left(\bigcup_{k=2^\ell+1}^{2^{\ell+1}}A_k\right)=0. $$ By the independence of the events $(A_k)$, this means $$ \lim_{\ell\to\infty}\prod_{k=2^\ell+1}^{2^{\ell+1}}(1-p_k)=1. $$ Using $1-t\leqslant e^{-t}$, we get $$ \prod_{k=2^\ell+1}^{2^{\ell+1}}(1-p_k)\leqslant \exp\left(-\sum_{k=2^\ell+1}^{2^{\ell+1}}p_k\right)\leqslant 1, $$ so $\sum_{k=2^\ell+1}^{2^{\ell+1}}p_k\to 0$ and 2. is satisfied.
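(The block-event probability here can also be checked numerically, a sketch with the same illustrative $p_k=\frac1{k\log k}$: by independence, $\mathbb P\bigl(\bigcup_{k=2^\ell+1}^{2^{\ell+1}}A_k\bigr)=1-\prod_{k=2^\ell+1}^{2^{\ell+1}}(1-p_k)$, and this tends to $0$.)

```python
import numpy as np

# P(union of A_k over a dyadic block) = 1 - prod(1 - p_k); with
# p_k = 1/(k log k) it tends to 0, in line with condition 2.
k = np.arange(2, 2**21 + 1, dtype=float)
p = 1.0 / (k * np.log(k))
for ell in (5, 10, 15, 20):
    block = p[(k > 2**ell) & (k <= 2**(ell + 1))]
    print(ell, 1 - np.prod(1 - block))   # -> 0, roughly like 1/ell
```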

Also, the following are equivalent:

  1. $(S_n/n)$ converges to $0$ almost surely.
  2. $\sum_\ell p_\ell<\infty$.

Indeed, suppose that 2. holds. Using Chebyshev's inequality and then Doob's maximal inequality, we find that $$ \mathbb P\left(\max_{1\leqslant n\leqslant 2^N}\left\lvert \sum_{k=1}^n X_k\right\rvert>2^{N}\varepsilon\right)\leqslant \frac 1{2^{2N}\varepsilon^2}\mathbb E\left[\max_{1\leqslant n\leqslant 2^N}\left\lvert \sum_{k=1}^n X_k\right\rvert^2\right]\leqslant \frac 4{2^{2N}\varepsilon^2}\sum_{k=1}^{2^N}k^2p_k, $$ which shows, after summing over $N$, switching the order of summation, and applying the Borel–Cantelli lemma, that $2^{-N}\max_{1\leqslant n\leqslant 2^N}\left\lvert \sum_{k=1}^n X_k\right\rvert\to 0$ almost surely. This implies $S_n/n\to 0$ almost surely, since for $2^N\leqslant n<2^{N+1}$ we have $\lvert S_n\rvert/n\leqslant 2\cdot 2^{-(N+1)}\max_{1\leqslant m\leqslant 2^{N+1}}\lvert S_m\rvert$.
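(The summation switch can be sanity-checked numerically: for every $k$, $\sum_{N:\,2^N\geqslant k}2^{-2N}\leqslant\frac43 k^{-2}$, so $\sum_N 2^{-2N}\sum_{k\leqslant 2^N}k^2p_k\leqslant\frac43\sum_k p_k<\infty$. A minimal numeric check, with the illustrative summable choice $p_k=k^{-3/2}$:)

```python
import numpy as np

# Check: sum_N 4^{-N} sum_{k<=2^N} k^2 p_k <= (4/3) sum_k p_k for summable p_k.
k = np.arange(1, 2**16 + 1, dtype=float)
p = k ** -1.5                                       # illustrative summable choice
w = np.cumsum(k**2 * p)                             # partial sums of k^2 p_k
lhs = sum(4.0**-N * w[2**N - 1] for N in range(0, 17))
print(lhs, (4 / 3) * p.sum())                       # lhs <= (4/3) * sum p_k
```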

Conversely, if $S_n/n\to 0$ almost surely, then $X_n/n\to 0$ almost surely; since $\mathbb P(\lvert X_n\rvert=n)=p_n$ and the $X_n$ are independent, the second Borel–Cantelli lemma forces $\sum_n p_n<\infty$, i.e. 2. holds.

Davide Giraudo
  • 181,608