
I was trying to understand why almost sure convergence doesn't imply convergence of the mean and I encountered this answer.

However, I do not understand why this sequence of random variables converges to $0$ almost surely. How is this example different from that one? To me, both sequences work the same way: the probability of being $0$ increases with $n$. So why does one of them converge to $0$ almost surely while the other doesn't?

bg5
  • If you are referring, for the second, to Davide Giraudo's answer: because the probability of being non-zero decreases much faster ($1/n^2$) in the first case than in the second ($1/n$). At a very hazy level (that e.g. Borel–Cantelli helps to make precise), we have $\sum_n \frac{1}{n^2} < \infty$, while $\sum_n \frac{1}{n} = \infty$. – Clement C. Jun 11 '16 at 18:43
  • Also, it would help if you could include the actual sequences of r.v. in your question, to make it self-contained. – Clement C. Jun 11 '16 at 18:47
  • Thank you for your help. As for quoting those answers: I didn't know how to include the authors' names, so I decided not to quote them at all. Sorry if that made my question harder to read. – bg5 Jun 11 '16 at 19:05
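
To make Clement C.'s Borel–Cantelli point concrete, here is a sketch with a hypothetical pair of sequences; the linked answers are not quoted in the question, so these particular definitions are an assumption for illustration, not necessarily the exact sequences the question refers to. Take events $A_n$ and set $X_n = \mathbf{1}_{A_n}$.

$$P(A_n) = \tfrac{1}{n^2}: \qquad \sum_n P(X_n \neq 0) = \sum_n \tfrac{1}{n^2} < \infty \;\Longrightarrow\; P(X_n \neq 0 \text{ infinitely often}) = 0$$

by the first Borel–Cantelli lemma, so almost every $\omega$ has $X_n(\omega) = 0$ for all large $n$, i.e. $X_n \to 0$ almost surely.

$$P(A_n) = \tfrac{1}{n}, \; A_n \text{ independent}: \qquad \sum_n P(X_n \neq 0) = \sum_n \tfrac{1}{n} = \infty \;\Longrightarrow\; P(X_n \neq 0 \text{ infinitely often}) = 1$$

by the second Borel–Cantelli lemma, so almost surely $X_n = 1$ for infinitely many $n$, and $X_n$ does not converge to $0$ almost surely (even though $P(X_n \neq 0) = \tfrac{1}{n} \to 0$, so it does converge to $0$ in probability).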

0 Answers