
I am self-studying introductory stochastic calculus from A First Course in Stochastic Calculus by L.-P. Arguin.

Part (c) of the exercise below, on the time-inversion property of Brownian motion, asks for a derivation of the law of large numbers from the previously developed results. I struggled to write a proof of this.

I would like to ask whether my upper bounds and convergence argument for part (c) make sense, and whether they are technically correct and rigorous.

I reproduce my solution to parts (a) and (b) for completeness.

Time Inversion. Let $(B_{t},t\geq0)$ be a standard Brownian motion. We consider the process:

\begin{align*} X_{t} & =tB_{1/t}\quad\text{for }t>0 \end{align*}

This property relates the behavior of the process for large $t$ to its behavior for small $t$.

(a) Show that $(X_{t},t>0)$ has the distribution of Brownian motion on $t>0$.

Proof.

Like $(B_t)$, the process $(X_t)$ is easily seen to be Gaussian.

Also, $\mathbb{E}[X_{s}]=0$ for all $s>0$.

Let $s<t$. We have:

\begin{align*} \operatorname{Cov}(X_{s},X_{t}) & =\mathbb{E}[sB(1/s)\cdot tB(1/t)]\\ & =st\,\mathbb{E}[B(1/s)B(1/t)]\\ & =st\cdot\min\left(\frac{1}{s},\frac{1}{t}\right)\\ & =st\cdot\frac{1}{t}\qquad\left\{ \because\frac{1}{t}<\frac{1}{s}\right\} \\ & =s \end{align*}

Consequently, $X(t)$ has the distribution of a Brownian motion.
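
As a quick numerical sanity check of the covariance computation (not part of the proof; the script and variable names below are just my own illustration), a small Monte Carlo estimate of $\operatorname{Cov}(X_s,X_t)$ for one pair $s<t$ agrees with $\min(s,t)$:

```python
import numpy as np

# Monte Carlo sanity check (not part of the proof):
# for s < t, Cov(X_s, X_t) with X_u = u * B_{1/u} should equal min(s, t) = s.
rng = np.random.default_rng(seed=0)
n_paths = 500_000
s, t = 0.5, 2.0  # s < t, hence 1/t < 1/s

# Sample B_{1/t} ~ N(0, 1/t), then B_{1/s} = B_{1/t} + independent N(0, 1/s - 1/t).
B_inv_t = rng.normal(0.0, np.sqrt(1.0 / t), size=n_paths)
B_inv_s = B_inv_t + rng.normal(0.0, np.sqrt(1.0 / s - 1.0 / t), size=n_paths)

X_s = s * B_inv_s
X_t = t * B_inv_t
print(np.cov(X_s, X_t)[0, 1])  # should be close to 0.5 = min(s, t)
```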

(b) Argue that $X(t)$ converges to $0$ as $t\to0$ in the sense of $L^{2}$-convergence. It is possible to show convergence almost surely so that $(X_{t},t\geq0)$ is really a Brownian motion for $t\geq0$.

Solution.

Let $(t_{n})$ be an arbitrary sequence of positive real numbers approaching $0$, and consider the sequence of random variables $(X(t_{n}))_{n=1}^{\infty}$. We have:

\begin{align*} \mathbb{E}\left[X(t_{n})^{2}\right] & =\mathbb{E}\left[t_{n}^{2}B(1/t_{n})^{2}\right]\\ & =t_{n}^{2}\mathbb{E}\left[B(1/t_{n})^{2}\right]\\ & =t_{n}^{2}\cdot\frac{1}{t_{n}}\\ & =t_{n} \end{align*}

Hence,

\begin{align*} \lim_{n\to\infty}\mathbb{E}\left[X(t_{n})^{2}\right] & =\lim_{n\to\infty}t_{n}=0 \end{align*}

Since $(t_{n})$ was an arbitrary sequence, it follows that $\lim_{t\to0}\mathbb{E}[(X(t))^{2}]=0$.
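
As a small additional observation (not required by the exercise), Chebyshev's inequality turns this $L^{2}$ bound into convergence in probability:

\begin{align*} \mathbf{P}\left(|X(t)|>\epsilon\right) & \leq\frac{\mathbb{E}\left[X(t)^{2}\right]}{\epsilon^{2}}=\frac{t}{\epsilon^{2}}\xrightarrow[t\to0]{}0\quad\text{for every fixed }\epsilon>0. \end{align*}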

(c) Use this property of Brownian motion to show the law of large numbers for Brownian motion: \begin{align*} \lim_{t\to\infty}\frac{X(t)}{t} & =0\quad\text{almost surely} \end{align*}

Proof Sketch.

Let $(t_n)$ be an arbitrary sequence with $t_n\to\infty$. Thus, for every $n\in\mathbf{N}$ there exists an index $k_n$ such that $t_{k_n}>n$.

Consider the sequence of random variables $X_n := X(t_n)$. Let $\epsilon>0$ be arbitrary. We have:

\begin{align*} \mathbf{P}\left(\left|\frac{X(t_n)}{t_n}\right|>\epsilon\right) &= \mathbf{P}\left[\left(\frac{X(t_n)}{t_n}\right)^4>\epsilon^4\right]\\ &= \mathbf{P}[X(t_n)^4 > t_n^4 \epsilon^4]\\ &\leq \frac{1}{t_n^4 \epsilon^4} \mathbf{E}[X(t_n)^4]\\ & \quad \left\{ \text{ Chebyshev's inequality }\right\} \\ &= \frac{1}{t_n^4 \epsilon^4} \cdot 3t_n^2 \\ & \quad \left\{ \text{ fourth moment of a standard Brownian motion }\right\} \\ &= \frac{3}{\epsilon^4} \cdot \frac{1}{t_n^2} \\ &\leq \frac{3}{\epsilon^4} \cdot \frac{1}{t_{k_n}^2} \\ &\leq \frac{3}{\epsilon^4} \cdot \frac{1}{n^2} \end{align*}
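
For completeness, the fourth moment used above is immediate from $\mathbf{E}[Z^{4}]=3\sigma^{4}$ for $Z\sim N(0,\sigma^{2})$, since $B(1/t_{n})\sim N(0,1/t_{n})$:

\begin{align*} \mathbf{E}\left[X(t_{n})^{4}\right] & =t_{n}^{4}\,\mathbf{E}\left[B(1/t_{n})^{4}\right]=t_{n}^{4}\cdot3\cdot\frac{1}{t_{n}^{2}}=3t_{n}^{2} \end{align*}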

Since $\sum \frac{1}{n^2}$ is a convergent series, by the comparison test $\sum_{n=1}^{\infty} \mathbf{P}\left(\left|\frac{X(t_n)}{t_n}\right|>\epsilon\right)$ converges.

We know that if, for all $\epsilon>0$, $\sum_{n=1}^{\infty} \mathbf{P}(|X_n - X| > \epsilon) < \infty$, then $X_n \to X$ almost surely (a standard consequence of the Borel–Cantelli lemma).

Consequently, $\lim_{n \to \infty} \frac{X(t_n)}{t_n} = 0$ almost surely. Since $(t_n)$ was an arbitrary sequence, $\lim_{t \to \infty} \frac{X(t)}{t} = 0$ almost surely.

  • The proof of (c) seems good to me – jd27 Jun 04 '23 at 07:11
  • Wait a minute. What does proving $\frac{X_{t}}{t}$ tends to $0$ as $t\to \infty$ achieve? It just shows that $B_{1/t}\to 0$ as $t\to\infty$ almost surely, which is an obvious fact due to continuity of Brownian motion. @jd27 please read through the whole thing before making a comment. – Mr. Gandalf Sauron Jun 04 '23 at 08:36
  • @Mr.GandalfSauron, isn't $X(t)$ itself a Brownian motion (different from the original Brownian motion)? Do you think the proof is incorrect? – Quasar Jun 04 '23 at 08:42
  • @Quasar That has to be proven. The continuity at $0$ is the step which you have not shown. – Mr. Gandalf Sauron Jun 04 '23 at 08:43
  • @Quasar You have shown, in the $L^2$ sense and hence in probability, that $X(t)\to 0$ as $t\to 0$. That does not yield continuity at $0$ almost surely. That is the main step of the proof, and THAT is the "Strong" Law of Large Numbers for Brownian motion. – Mr. Gandalf Sauron Jun 04 '23 at 08:47
  • And at the end, proving $\frac{X(t)}{t}\to 0$ as $t\to\infty$ is obvious and follows directly from continuity of $B$. That is, you are only proving that $B_{1/t}\to 0$ as $t\to\infty$, which is trivial from the definition (continuity) of SBM. – Mr. Gandalf Sauron Jun 04 '23 at 08:48
  • @Mr.GandalfSauron in the post OP says that his goal is to prove $\lim_{t \to \infty} X(t)/t = 0$ a.s. as part (c) of the exercise, where I assume $X(t)$ is defined as before, which is what OP did. Now if OP actually wants to prove something else, then that is a different story, but then maybe the question should be edited. – jd27 Jun 04 '23 at 09:00

1 Answer


What you need to do is show that $X_{t}\to 0$ as $t\to 0$ almost surely. That would show that $\frac{B_{1/t}}{1/t}\to 0$ as $t\to 0$ almost surely, which is the same as showing $\frac{B_{t}}{t}\to 0$ as $t\to\infty$, which is the law of large numbers for Brownian motion.

What you have done is show that $E(X(t)^{2})\to 0$ as $t\to 0$, which gives convergence in the $L^2$ sense and hence convergence in probability. That is in fact the Weak Law of Large Numbers, i.e. $\frac{B_{t}}{t}\xrightarrow{P} 0$ as $t\to\infty$.

To show almost sure convergence, you have to argue that $X(t)\to 0$ as $t\to 0$, which cannot be deduced from the $L^2$ convergence alone, even with subsequence arguments. That is a separate proof in itself, and it is exactly what one does to show that $(X_{t})$ is indeed a Brownian motion: you have to establish continuity of the sample paths almost surely.

For $t>0$, continuity is clear. However, the proof that $X(t)\to 0$ almost surely as $t\to 0$ is the main step, and it is the one you have not done. You have only shown convergence in $L^2$, and hence in probability, which is NOT equivalent to almost sure convergence.

I am posting the proof I like the most, from René Schilling's book on Brownian motion.

Note that $X(t)\to 0$ as $t\to 0$ if and only if for all $n\geq 1$ there exists $m\geq 1$ such that for all $r\in \mathbb{Q}\cap (0,\frac{1}{m}]$ you have $|X(r)|=|rB(1/r)|\leq \frac{1}{n}$.

(To understand the above, recall the $\epsilon$-$\delta$ definition of continuity: $\frac{1}{n}$ plays the role of $\epsilon$ and $\frac{1}{m}$ the role of $\delta$.)

That is, $$\{X(t)\to 0\text{ as }t\to 0\} = \bigcap_{n\geq 1}\bigcup_{m\geq 1}\bigcap_{r\in\mathbb{Q}\cap (0,\frac{1}{m}]}\left\{|X(r)|\leq\frac{1}{n}\right\}$$

But the RHS has the same probability as $\bigcap_{n\geq 1}\bigcup_{m\geq 1}\bigcap_{r\in\mathbb{Q}\cap (0,\frac{1}{m}]}\{|W(r)|\leq\frac{1}{n}\}$, where $W$ denotes a standard Brownian motion. This is because $X_{t}$ and $W_{t}$ have the same law, i.e. the same finite-dimensional distributions, and the unions/intersections on the RHS are all countable.

Thus $P(\{X(t)\to 0\text{ as }t\to 0\})=P(\{W(t)\to 0\text{ as }t\to 0\}) = 1$, the last equality holding because a standard Brownian motion is continuous at $0$ by definition, and that's it.

This actually shows that $X(t)$ is a bona fide standard Brownian motion, as we have established continuity at $0$ as well.

There are many other ways of proving this; for example, see here. A standard proof goes via Doob's maximal inequality; see this note.
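
For reference, the maximal-inequality argument typically runs roughly as follows (a sketch written from memory, not taken from the linked note). For fixed $\epsilon>0$ and $n\geq1$, Doob's maximal inequality applied to the submartingale $B_{t}^{2}$ gives

\begin{align*} P\left(\sup_{2^{n}\leq t\leq2^{n+1}}\frac{|B_{t}|}{t}\geq\epsilon\right) & \leq P\left(\sup_{0\leq t\leq2^{n+1}}|B_{t}|\geq\epsilon2^{n}\right)\leq\frac{E\left[B_{2^{n+1}}^{2}\right]}{\epsilon^{2}2^{2n}}=\frac{2}{\epsilon^{2}2^{n}}, \end{align*}

which is summable in $n$. So by Borel–Cantelli, almost surely $\sup_{2^{n}\leq t\leq2^{n+1}}\frac{|B_{t}|}{t}<\epsilon$ for all large $n$, hence $\limsup_{t\to\infty}\frac{|B_{t}|}{t}\leq\epsilon$ almost surely; intersecting over $\epsilon=\frac{1}{k}$, $k\geq1$, gives $\frac{B_{t}}{t}\to0$ almost surely.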

Actually, time inversion and the law of large numbers are two sides of the same coin: each implies the other.

  • Why are we happy with $\{|X(r)| \leq \frac{1}{n}\}$ holding only for the rationals in $(0,\frac{1}{m}]$? I understand that we want to work with countable unions/intersections.

    But, for instance, for a function like $f=\mathbf{1}_{\mathbb{Q}}$, we have, for all $\epsilon>0$ and all $r \in \mathbb{Q} \cap (0,\frac{1}{m}]$, $|f(r)-f(0)|<\epsilon$, yet $f$ is not continuous at $0$.

    – Quasar Jun 04 '23 at 09:46
  • That is just due to the density of the rationals and the fact that $X(t)$ is continuous for $t>0$, hence uniformly continuous on compact subsets of $(0,\infty)$. – Mr. Gandalf Sauron Jun 04 '23 at 09:48
  • To elaborate further, a function defined on a dense subset that is uniformly continuous on all compact subsets of $\Bbb{R}$ extends uniquely to a continuous function. That is precisely what I have done above. – Mr. Gandalf Sauron Jun 04 '23 at 09:56