
Let $\{X_n\}_{n \geq 1}$ be a sequence of random variables with $\mathbb{E}[X_n] = u$. Suppose $\lim_{n \to \infty}\mathrm{Var}[X_n] = 0$. Does it follow that $X_n$ converges to the constant $u$ almost surely?


This question actually comes from proving that the quadratic variation of Brownian motion $B(t)$ is $t$. I was wondering how the above argument, applied to $X_n = \sum_i[B(t_i^n)-B(t_{i-1}^n)]^2$, implies that the quadratic variation of Brownian motion $B(t)$ is $t$.
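
For concreteness, here is the mean/variance computation for $X_n$ that I have in mind (I am assuming the partitions $0 = t_0^n < t_1^n < \cdots < t_{k_n}^n = t$ have mesh $\delta_n = \max_i (t_i^n - t_{i-1}^n) \to 0$, since that is the usual setup even though it is not restated above). The increments $B(t_i^n) - B(t_{i-1}^n)$ are independent $N(0, \Delta_i)$ with $\Delta_i = t_i^n - t_{i-1}^n$, and $\mathrm{Var}[Z^2] = 2\sigma^4$ for $Z \sim N(0, \sigma^2)$, so

$$\mathbb{E}[X_n] = \sum_i \Delta_i = t, \qquad \mathrm{Var}[X_n] = \sum_i 2\Delta_i^2 \le 2\,\delta_n \sum_i \Delta_i = 2\,\delta_n\, t \xrightarrow[n \to \infty]{} 0.$$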

zxzx179
  • 1,577
  • As far as I know, the approximate quadratic variation need not converge a.s. without extra conditions. Refer to this, for instance. I am not sure how you would relate those two problems. – Sangchul Lee Feb 23 '17 at 22:42
  • It's a theorem in my textbook. Refer to this https://books.google.com/books?id=JYzW0uqQxB0C&lpg=PA63&vq=quadratic%20variance&pg=PA63#v=snippet&q=quadratic%20variance&f=false Page 63. – zxzx179 Feb 23 '17 at 22:48

1 Answer


The easiest tool here is Chebyshev's Inequality: it shows at once that $X_n \to u$ in probability.
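
To spell that step out: for any $\varepsilon > 0$,

$$\mathbb{P}\bigl(|X_n - u| > \varepsilon\bigr) \le \frac{\mathrm{Var}[X_n]}{\varepsilon^2} \xrightarrow[n \to \infty]{} 0,$$

which is exactly convergence in probability; Chebyshev by itself says nothing about almost sure convergence.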

Edit: Good point: Chebyshev alone only gives convergence in probability, but you can upgrade to almost sure convergence via Borel-Cantelli if the variances are summable. In your example, I believe it is already known that the approximate quadratic variation has an almost sure limit, so the remaining work is identifying that limit as $t$; I'm blanking on the details right now, but that is likely the argument your textbook uses.
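
Here is a sketch of that upgrade, assuming dyadic partitions $t_i^n = i t/2^n$ (my assumption; it matches the usual textbook statement, where almost sure convergence is claimed along a fixed refining sequence of partitions). Then $\mathrm{Var}[X_n] = 2\sum_{i=1}^{2^n} (t/2^n)^2 = 2t^2/2^n$, so for any $\varepsilon > 0$,

$$\sum_{n \ge 1} \mathbb{P}\bigl(|X_n - t| > \varepsilon\bigr) \le \sum_{n \ge 1} \frac{\mathrm{Var}[X_n]}{\varepsilon^2} = \sum_{n \ge 1} \frac{2t^2}{\varepsilon^2\, 2^n} < \infty.$$

By Borel-Cantelli, $|X_n - t| > \varepsilon$ occurs only finitely often almost surely, for every $\varepsilon > 0$, hence $X_n \to t$ almost surely along these partitions.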

Pepe Silvia
  • 1,704