I am self-studying introductory stochastic calculus from A First Course in Stochastic Calculus by L.-P. Arguin.
Part (c) of the exercise below, on the time-inversion property of Brownian motion, asks us to derive the law of large numbers for Brownian motion from the previously developed results. I struggled to write a proof of this.
I would like to ask whether my upper bounds and convergence argument for part (c) make sense and are technically correct and rigorous.
I reproduce my solution to parts (a) and (b) for completeness.
Time Inversion. Let $(B_{t},t\geq0)$ be a standard Brownian motion. We consider the process:
\begin{align*} X_{t} & =tB_{1/t}\quad\text{for }t>0 \end{align*}
This transformation relates the behavior of the process for large $t$ to its behavior for small $t$.
(a) Show that $(X_{t},t>0)$ has the distribution of Brownian motion on $t>0$.
Proof.
It is straightforward to check that, like $(B_{t})$, the process $(X_{t})$ is Gaussian: any finite linear combination $\sum_{i}a_{i}X_{t_{i}}=\sum_{i}a_{i}t_{i}B_{1/t_{i}}$ is a linear combination of the jointly Gaussian variables $B_{1/t_{i}}$, hence Gaussian.
Also, $\mathbb{E}[X_{t}]=t\,\mathbb{E}[B_{1/t}]=0$ for all $t>0$.
Let $s<t$. We have:
\begin{align*} \operatorname{Cov}(X_{s},X_{t}) & =\mathbb{E}[sB(1/s)\cdot tB(1/t)]\\ & =st\,\mathbb{E}[B(1/s)\cdot B(1/t)]\\ & =st\cdot\min\left(\frac{1}{s},\frac{1}{t}\right)\\ & =st\cdot\frac{1}{t}\\ & \quad\left\{ \because\frac{1}{t}<\frac{1}{s}\right\} \\ & =s \end{align*}
Since the law of a Gaussian process is determined by its mean and covariance functions, $(X_{t},t>0)$ has the distribution of a standard Brownian motion on $t>0$.
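As a sanity check (not part of the proof), a quick Monte Carlo simulation can confirm the covariance computation: sampling $B$ at the reversed times $1/t<1/s$ via independent Gaussian increments, the empirical covariance of $X_{s}=sB_{1/s}$ and $X_{t}=tB_{1/t}$ should be close to $\min(s,t)$. A minimal sketch in Python (all variable names are my own):

```python
import math
import random

random.seed(0)

s, t = 0.5, 2.0        # s < t, so Cov(X_s, X_t) should equal min(s, t) = s = 0.5
n_samples = 200_000

acc = 0.0
for _ in range(n_samples):
    # Sample B at the reversed times 1/t < 1/s using independent increments:
    # B_{1/t} ~ N(0, 1/t), and B_{1/s} - B_{1/t} ~ N(0, 1/s - 1/t).
    b_inv_t = math.sqrt(1.0 / t) * random.gauss(0.0, 1.0)
    b_inv_s = b_inv_t + math.sqrt(1.0 / s - 1.0 / t) * random.gauss(0.0, 1.0)
    acc += (s * b_inv_s) * (t * b_inv_t)   # one sample of X_s * X_t

cov_estimate = acc / n_samples
print(cov_estimate)  # should land close to 0.5
```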
(b) Argue that $X(t)$ converges to $0$ as $t\to0$ in the sense of $L^{2}$-convergence. It is possible to show convergence almost surely so that $(X_{t},t\geq0)$ is really a Brownian motion for $t\geq0$.
Solution.
Let $(t_{n})$ be an arbitrary sequence of positive real numbers approaching $0$, and consider the sequence of random variables $(X(t_{n}))_{n=1}^{\infty}$. We have:
\begin{align*} \mathbb{E}\left[X(t_{n})^{2}\right] & =\mathbb{E}\left[t_{n}^{2}B(1/t_{n})^{2}\right]\\ & =t_{n}^{2}\mathbb{E}\left[B(1/t_{n})^{2}\right]\\ & =t_{n}^{2}\cdot\frac{1}{t_{n}}\\ & =t_{n} \end{align*}
Hence,
\begin{align*} \lim_{n\to\infty}\mathbb{E}\left[X(t_{n})^{2}\right] & =\lim_{n\to\infty}t_{n}=0 \end{align*}
Since $(t_{n})$ was an arbitrary sequence, it follows that $\lim_{t\to0}\mathbb{E}[(X(t))^{2}]=0$; that is, $X(t)\to0$ in $L^{2}$ as $t\to0$.
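To illustrate the $L^{2}$ statement numerically: since $B_{1/t}\sim N(0,1/t)$, one can sample $X_{t}=tB_{1/t}$ directly and watch the empirical second moment track $t$ as $t\to0$. A sketch under these assumptions (variable names are my own):

```python
import math
import random

random.seed(1)

n_samples = 100_000
second_moments = {}
for t in (0.5, 0.1, 0.01):
    # X_t = t * B_{1/t} with B_{1/t} ~ N(0, 1/t), i.e. X_t ~ N(0, t).
    acc = 0.0
    for _ in range(n_samples):
        x_t = t * math.sqrt(1.0 / t) * random.gauss(0.0, 1.0)
        acc += x_t * x_t
    second_moments[t] = acc / n_samples

print(second_moments)  # each estimate should be close to t itself
```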
(c) Use this property of Brownian motion to show the law of large numbers for Brownian motion: \begin{align*} \lim_{t\to\infty}\frac{X(t)}{t} & =0\quad\text{almost surely} \end{align*}
Proof Sketch.
Let $(t_n)$ be a sequence of positive reals with $t_n \to \infty$. Assume additionally that $t_n \geq n$ for all $n$; this restriction is needed to make the bound below summable (for a slowly growing sequence such as $t_n = \sqrt{n}$, the fourth-moment bound only gives $\sum 1/n$, which diverges).
Let $\epsilon > 0$ be arbitrary. We have:
\begin{align*} \mathbf{P}\left(\left|\frac{X(t_n)}{t_n}\right|>\epsilon\right) &= \mathbf{P}\left[\left(\frac{X(t_n)}{t_n}\right)^4>\epsilon^4\right]\\ &= \mathbf{P}\left[X(t_n)^4 > t_n^4 \epsilon^4\right]\\ &\leq \frac{1}{t_n^4 \epsilon^4}\, \mathbf{E}\left[X(t_n)^4\right]\\ & \quad \left\{ \text{ Markov's inequality applied to } X(t_n)^4 \right\} \\ &= \frac{1}{t_n^4 \epsilon^4} \cdot 3t_n^2 \\ & \quad \left\{ \mathbf{E}[B_t^4]=3t^2 \text{ for a standard Brownian motion, and } X \text{ is one by part (a)} \right\} \\ &= \frac{3}{\epsilon^4} \cdot \frac{1}{t_n^2} \\ &\leq \frac{3}{\epsilon^4} \cdot \frac{1}{n^2} \quad \left\{ \because t_n \geq n \right\} \end{align*}
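The only distributional input in the chain above is the fourth moment $\mathbf{E}[B_t^4]=3t^2$ of a centered Gaussian with variance $t$. A quick numerical check of that fact (a sketch, with my own variable names):

```python
import math
import random

random.seed(2)

t = 2.0
n_samples = 300_000
# B_t ~ N(0, t), so sample it as sqrt(t) * Z with Z a standard normal.
m4 = sum((math.sqrt(t) * random.gauss(0.0, 1.0)) ** 4
         for _ in range(n_samples)) / n_samples
print(m4)  # should be close to 3 * t**2 = 12
```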
Since $\sum \frac{1}{n^2}$ is a convergent series, by the comparison test the series $\sum_{n=1}^{\infty} \mathbf{P}\left(\left|\frac{X(t_n)}{t_n}\right|>\epsilon\right)$ also converges.
We know that if, for every $\epsilon>0$, $\sum_{n=1}^{\infty} \mathbf{P}(|X_n - X| > \epsilon) < \infty$, then $X_n \to X$ almost surely (a standard consequence of the first Borel–Cantelli lemma).
Consequently, $\frac{X(t_n)}{t_n} \to 0$ almost surely as $n \to \infty$. Passing from this to $\lim_{t \to \infty} \frac{X(t)}{t} = 0$ requires one more step, since the exceptional null set above depends on the chosen sequence and there are uncountably many sequences. One way to finish is via the time-inversion property itself: by part (b), $sB_{1/s} \to 0$ almost surely as $s \to 0$, so substituting $s = 1/t$ gives $\frac{B_t}{t} = \frac{1}{t}B_t = sB_{1/s} \to 0$ almost surely as $t \to \infty$; the same conclusion holds for $X$, since $X$ is itself a standard Brownian motion by parts (a) and (b).
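Finally, one can watch the law of large numbers in action on a simulated path: generating $B$ on a coarse grid out to a large horizon via independent increments, the ratio $B_t/t$ (whose standard deviation is $1/\sqrt{t}$) shrinks toward $0$. A minimal sketch, assuming a grid of step $100$ up to $T = 10^6$ (my own setup, not from the book):

```python
import math
import random

random.seed(3)

T, step = 1_000_000.0, 100.0
b, t = 0.0, 0.0
ratio = None
while t < T:
    # Brownian increment over [t, t + step] is N(0, step).
    b += math.sqrt(step) * random.gauss(0.0, 1.0)
    t += step
    ratio = b / t  # current value of B_t / t

print(ratio)  # B_T / T at T = 1e6; typical size is 1/sqrt(T) = 1e-3
```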