
The Problem: Given a Brownian motion $(B_s,\, s \in [0,t])$ on $[0,t]$ and a partition $0=t_0 < t_1 < \dots < t_n \le t$ of $[0,t]$ such that $\max_{j}|t_{j+1} - t_j| \to 0$ as $n \to \infty$, show that the random variable $$\sum_{j=1}^n (B_{t_j} - B_{t_{j-1}})^2$$ converges to $t$ in $L^2$ as $n \to \infty$.

I am taking a class in stochastic calculus and I am quite rusty on probability theory. However, I think I understand the key concepts of Brownian motion, so I need some help putting the pieces together. I know I want to show that $$\lim_{n \to \infty} \left\Vert \sum_{j=1}^n (B_{t_j} - B_{t_{j-1}})^2- t \right\Vert_2^2 = \lim_{n \to \infty} \mathbb{E}\left[ \left| \sum_{j=1}^n (B_{t_j} - B_{t_{j-1}})^2- t \right|^2 \right]=0.$$

I also realize that $$\mathbb{E} \left[ \sum_{j=1}^n (B_{t_j} - B_{t_{j-1}})^2 \right ]= \sum_{j=1}^n \mathbb{E} \left[ (B_{t_j} - B_{t_{j-1}})^2 \right ]=\sum_{j=1}^n (t_j - t_{j-1})=t_n=t$$ because each increment $B_{t_j} - B_{t_{j-1}}$ has variance $t_j - t_{j-1}$ and the sum telescopes. I guess my biggest problem is that I don't know how to break into this $L^2$ norm mathematically. Any help is greatly appreciated.
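Edit: as a sanity check before proving anything, I ran a quick simulation. This is a minimal sketch assuming a uniform partition of $[0,t]$ with $t=1$ (the variable names and path count are my own choices); empirically $\mathbb{E}\big[(\sum_j (\Delta B_j)^2 - t)^2\big]$ shrinks like $1/n$, which matches the bound $2\sum_j (t_{j+1}-t_j)^2 = 2t^2/n$ that comes out of the answer below.

```python
import numpy as np

# Minimal sanity check (assumptions: uniform partition of [0, t], t = 1,
# 5000 Monte Carlo paths; names here are illustrative, not from the problem).
rng = np.random.default_rng(0)
t = 1.0
n_paths = 5000

for n in [10, 100, 1000, 10_000]:
    dt = t / n  # uniform partition: t_j = j * t / n, so t_n = t exactly
    # Independent Brownian increments B_{t_{j+1}} - B_{t_j} ~ N(0, dt)
    dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
    qv = (dB**2).sum(axis=1)      # quadratic variation of each path
    mse = np.mean((qv - t) ** 2)  # estimates E[(sum (dB)^2 - t)^2]
    print(f"n = {n:6d}   E[(QV - t)^2] ~ {mse:.2e}   theory 2t^2/n = {2 * t**2 / n:.2e}")
```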

1 Answer


Hint

In your partition, you took $t_n\leq t$, but you must take $t_n=t$. Then,

\begin{align*} &\mathbb E\left[\left(\sum_{j=0}^{n-1}(B_{t_{j+1}}-B_{t_j})^2-t\right)^2\right]\\ &=\mathbb E\left[\left(\sum_{j=0}^{n-1}\Big[(B_{t_{j+1}}-B_{t_j})^2-(t_{j+1}-t_j)\Big]\right)^2\right]\\ &=\sum_{i=0}^{n-1}\sum_{j=0}^{n-1}\mathbb E\left[\Big((B_{t_{j+1}}-B_{t_j})^2-(t_{j+1}-t_j)\Big)\Big((B_{t_{i+1}}-B_{t_i})^2-(t_{i+1}-t_i)\Big)\right], \end{align*}

where the first equality uses the telescoping identity $t=t_n=\sum_{j=0}^{n-1}(t_{j+1}-t_j)$; this is exactly where $t_n=t$ is needed.

I let you continue.

Surb
  • OK that's great, I was a little surprised to see those summation symbols come outside the expectation, but I see what you did there. – jeffery_the_wind Feb 14 '21 at 15:58
  • I feel like now the next step is to break this expression into 2 cases: when $i=j$ and when $i \ne j$. If $i \ne j$ then we have independent intervals of Brownian motion, so we can say the expectation of the product is the product of the expectation, and for those we easily get zero. For the case where $i=j$ we can't assume independence, we would have $E[(B_{t_{j+1}} - B_{t_j})^4 - 2(t_{j+1} - t_j)(B_{t_{j+1}} - B_{t_j})^2 + (t_{j+1} - t_j)^2] = E[(B_{t_{j+1}} - B_{t_j})^4] - (t_{j+1} - t_j)^2$. This looks very promising but I don't know about the fourth moment?! – jeffery_the_wind Feb 14 '21 at 16:08
  • @jeffery_the_wind: $B_{t_{j+1}}-B_{t_j}\sim \mathcal N(0,t_{j+1}-t_j)$. – Surb Feb 14 '21 at 16:17
  • Yes I understand that, but I'm having a hard time understanding how to deal with the fourth power there. I have seen results stating the fourth moment of a gaussian is $3 \sigma^4$, which doesn't help. https://math.stackexchange.com/questions/1917647/proving-ex4-3%CF%834 – jeffery_the_wind Feb 14 '21 at 16:40
  • @jeffery_the_wind: Yes, that's very good. Then, $$\mathbb E[(B_{t_{j+1}}-B_{t_j})^4]=3(t_{j+1}-t_j)^2.$$ – Surb Feb 14 '21 at 16:42
  • Yes so then we still end up with $2(t_{j+1} - t_j)^2$ inside the sum, which isn't zero... – jeffery_the_wind Feb 14 '21 at 20:04
  • @jeffery_the_wind: Of course it won't be $0$. Nevertheless $$\sum_{j=0}^{n-1}(t_{j+1}-t_j)^2\leq t\cdot \max_{j=0,...,n-1}|t_{j+1}-t_j|\underset{n\to \infty }{\longrightarrow }0.$$ – Surb Feb 14 '21 at 20:05
  • OK, I was thinking that I needed to use that property given in the question somehow; this is the style of thing I am lacking right now. It is really interesting how this can converge to zero even though, intuitively, it seems we are always taking a sum of nonzero values. What exactly is the property that allows you to write the inequality in that last step? – jeffery_the_wind Feb 14 '21 at 20:17
  • No specific property... just the fact that $|t_{i+1}-t_i|\leq \max_{j=0,...,n-1}|t_{j+1}-t_j|$ for all $i$. – Surb Feb 14 '21 at 20:20
  • Thank you so much! I got it! – jeffery_the_wind Feb 14 '21 at 23:49
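For the record, assembling the hint and the steps worked out in the comment thread above: the cross terms ($i \neq j$) vanish by independence of Brownian increments, and each diagonal term uses the Gaussian fourth moment $\mathbb E[(B_{t_{j+1}}-B_{t_j})^4]=3(t_{j+1}-t_j)^2$, giving

\begin{align*} \mathbb E\left[\left(\sum_{j=0}^{n-1}(B_{t_{j+1}}-B_{t_j})^2-t\right)^2\right] &=\sum_{j=0}^{n-1}\mathbb E\left[\Big((B_{t_{j+1}}-B_{t_j})^2-(t_{j+1}-t_j)\Big)^2\right]\\ &=\sum_{j=0}^{n-1}\Big(3(t_{j+1}-t_j)^2-2(t_{j+1}-t_j)^2+(t_{j+1}-t_j)^2\Big)\\ &=2\sum_{j=0}^{n-1}(t_{j+1}-t_j)^2 \leq 2t\cdot\max_{j=0,\dots,n-1}|t_{j+1}-t_j| \underset{n\to \infty}{\longrightarrow} 0. \end{align*}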