
Consider the problem presented in this post: An example: The average of a sequence of random variables does not converge to the mean of each variable in the sequence almost surely. There it is shown that the sequence of averages $(S_n)_{n=2}^\infty$, where $S_n = \frac{1}{n - 1}\sum_{m=2}^n X_m$ and $P(X_m = m) = P(X_m = -m) = \frac{1}{2m\log(m)}$, $P(X_m = 0) = 1 - \frac{1}{m\log(m)}$, does not converge almost surely to $0$. But what do we know about the general behaviour of this sequence $(S_n)_{n=2}^\infty$? How could we determine whether or not it converges almost surely to some constant $c \in \mathbb{R}$?
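For intuition, a path of the sequence can be simulated directly. This is a minimal sketch (my own, not part of the question), assuming the $X_m$ are sampled independently as in the linked post; the function name `sample_X` and the seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_X(m, rng):
    """Sample X_m: value +m or -m with probability 1/(2 m log m) each, else 0."""
    p = 1.0 / (m * np.log(m))      # total probability of a nonzero value
    u = rng.random()
    if u < p / 2:
        return m
    elif u < p:
        return -m
    return 0

n = 10_000
X = np.array([sample_X(m, rng) for m in range(2, n + 1)])
S = np.cumsum(X) / np.arange(1, n)  # S_k = (1/(k-1)) * sum_{m=2}^k X_m
print(S[-1])                        # one realisation of S_n
```

Most $X_m$ are zero, so a typical path of $S_n$ sits near $0$ with occasional large spikes of size roughly $m/(n-1)$.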

Epsilon Away
  • Related (not intended to answer this question): https://stats.stackexchange.com/questions/563733/why-should-the-frequency-of-heads-in-a-coin-toss-converge-to-anything-at-all – Peter O. Feb 13 '22 at 17:51

1 Answer

Suppose $S_n$ converges almost surely to a constant $c \ne 0$. By Chebyshev's Inequality: $$ \Pr(|S_n| > \lambda) \le \lambda^{-2} \mathbb E |S_n|^2 .$$ Now, since each $X_m$ has mean zero and the $X_m$ are independent (as in the linked post), the cross terms vanish and $$ \mathbb E|S_n|^2 = \frac1{(n-1)^2} \sum_{m=2}^n \frac{m^2}{m \log(m)} ,$$ which is bounded by $C/\log(n)$ for some constant $C>0$ if $n \ge 2$, because \begin{align} \sum_{m=2}^n \frac{m^2}{m \log(m)} &\le \sum_{m=2}^{\lfloor \sqrt n \rfloor} \frac{m}{\log(m)} + \sum_{m=\lceil \sqrt n \rceil}^n \frac{m}{\log(m)} \\ &\le \frac{1}{\log 2}\sum_{m=2}^{\lfloor \sqrt n \rfloor} m + \sum_{m=\lceil \sqrt n \rceil}^n \frac{m}{\log(\sqrt n)} \\ &\le \frac{3}{2}\sum_{m=2}^{\lfloor \sqrt n \rfloor} \sqrt n + \sum_{m=1}^n \frac{2n}{\log(n)} \\ & \le \frac{3n}{2} + \frac{2n^2}{\log(n)} \\ & \le \frac{n^2}{\log(n)} + \frac{2n^2}{\log(n)} \\ &\le \frac{12(n-1)^2}{\log(n)} , \end{align} since $\frac{1}{\log 2} \le \frac{3}{2}$, $\frac{3}{2}\log(n) \le n$, and $n^2 \le 4(n-1)^2$ for $n \ge 2$.
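The end result of the chain, $\mathbb E|S_n|^2 = \frac{1}{(n-1)^2}\sum_{m=2}^n \frac{m}{\log(m)} \le \frac{12}{\log(n)}$, can be spot-checked numerically. The sketch below is my own, not part of the answer:

```python
import math

def second_moment(n):
    """E|S_n|^2 = (1/(n-1)^2) * sum_{m=2}^n m/log(m), since E X_m^2 = m/log(m)."""
    return sum(m / math.log(m) for m in range(2, n + 1)) / (n - 1) ** 2

for n in (2, 10, 100, 1_000, 10_000):
    assert second_moment(n) <= 12 / math.log(n)
print("E|S_n|^2 <= 12/log(n) holds for all tested n")
```

In fact the sum behaves like $\frac{n^2}{2\log(n)}$ for large $n$, so the constant $12$ has considerable slack.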

So $\Pr(|S_n| \ge |c|/2) \to 0$ as $n \to \infty$ (the same Chebyshev bound applies with $\ge$). But if $S_n \to c$ almost surely, then $S_n \to c$ in probability, so $\Pr(|S_n - c| < |c|/2) \to 1$ and hence $\Pr(|S_n| \ge |c|/2) \to 1$, a contradiction. Thus no constant $c \ne 0$ can be the almost-sure limit; since the linked post rules out $c = 0$, the sequence converges almost surely to no constant at all.
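A Monte Carlo estimate illustrates the convergence in probability, though the rate is only $1/\log(n)$, so the probabilities shrink very slowly. This is my own sketch under the independence assumption above; the seed, sample sizes, and threshold $1/2$ are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)      # arbitrary seed
n, trials = 2000, 400                # arbitrary sizes

m = np.arange(2, n + 1)
p = 1.0 / (m * np.log(m))            # P(X_m != 0)

# Draw all X_m for every trial at once: nonzero with probability p, fair sign.
u = rng.random((trials, n - 1))
signs = rng.choice((-1, 1), size=(trials, n - 1))
X = np.where(u < p, signs * m, 0)
S_n = X.sum(axis=1) / (n - 1)

freq = np.mean(np.abs(S_n) > 0.5)    # empirical estimate of P(|S_n| > 1/2)
print(freq)
```

The empirical frequency stays well below the Chebyshev bound $4\,\mathbb E|S_n|^2$ at this $n$, but it decays too slowly for the Borel–Cantelli route to almost-sure convergence.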

  • Why is the variance of $S_n$ bounded by $\frac{C}{\log(n)}$ for some constant $C$? – Epsilon Away Feb 16 '22 at 07:51
  • It is explained in the series of inequalities that follows the colon. – Stephen Montgomery-Smith Feb 16 '22 at 16:04
  • @Stephen Montgomery-Smith Ah, pardon me for not getting it before. What I'm still trying to understand is why the upper bound in Chebyshev's inequality approaches zero. Namely, you've shown that $\mathbb{E}(S_n^2) \leq C\frac{(n - 1)^2}{\log(n)}$. Then, if we take $\lambda = |c|/2$, isn't the upper bound on the probability equal to $C\frac{(n - 1)^2}{\log(n)}\cdot \frac{4}{c^2} = \frac{4C(n - 1)^2}{c^2\log(n)} \to +\infty$ as $n \to +\infty$? – Epsilon Away Feb 16 '22 at 16:39
  • The formula for $E|S_n|^2$ has an extra $1/(n-1)^2$ which is not in the sequence of inequalities. – Stephen Montgomery-Smith Feb 16 '22 at 17:05
  • Ah, good catch. The one last thing I'd like to confirm: is it okay, in this case, that the constant $C$ depends on $n$, because we are considering the limit of the probability inequality, in which each term of the sequence is bounded by $\frac{4C(n)}{c^2\log(n)}$? I say that $C$ depends on $n$, as you've bounded the sum by $C\frac{(n - 1)^2}{\log(n)}$. – Epsilon Away Feb 16 '22 at 17:10
  • Since it is an upper bound, the quantity $C$ can be chosen to be independent of $n$. – Stephen Montgomery-Smith Feb 16 '22 at 17:26
  • I showed that $C = 12$ will work if $n \ge 2$. – Stephen Montgomery-Smith Feb 16 '22 at 17:54
  • Hmm, it occurred to me after some thinking: how do we know that $\sum_{n=2}^\infty \mathbb{P}\left(|S_n| > |c|/2\right) < \infty$? We certainly know that $S_n$ converges in probability, but doesn't almost sure convergence require that the sum of the probabilities of the complementary events be finite? – Epsilon Away Feb 17 '22 at 08:55
  • You don't need that. Convergence almost everywhere implies convergence in probability (a consequence of Egorov's Theorem on a probability space). It's the converse that isn't true. – Stephen Montgomery-Smith Feb 17 '22 at 17:24
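To see concretely why convergence in probability does not imply almost-sure convergence, a standard illustration (my own addition, not from the thread) is the "typewriter" sequence of indicators on $[0,1)$: the $k$-th block consists of $2^k$ intervals of length $2^{-k}$ sweeping across $[0,1)$, so $\Pr(\text{indicator} = 1) = 2^{-k} \to 0$, yet every point lands in one interval of every block, hence in infinitely many.

```python
from fractions import Fraction

def interval(n):
    """n-th typewriter interval: writing n = 2^k + j with 0 <= j < 2^k,
    return the half-open interval [j/2^k, (j+1)/2^k)."""
    k = n.bit_length() - 1
    j = n - 2 ** k
    return Fraction(j, 2 ** k), Fraction(j + 1, 2 ** k)

x = Fraction(1, 3)   # any fixed point of [0, 1)
# Indices n < 2^12 whose interval contains x: exactly one per block k = 0..11.
hits = [n for n in range(1, 2 ** 12) if interval(n)[0] <= x < interval(n)[1]]
lengths = [interval(n)[1] - interval(n)[0] for n in (1, 10, 100, 1000)]
print(len(hits), lengths)
```

The interval lengths (the probabilities of the indicators being $1$) shrink to $0$, while `hits` shows $x$ is covered once in every block, so the indicator sequence at $x$ takes the value $1$ infinitely often and converges nowhere on $[0,1)$.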