26

How do you show that for every bound $\epsilon > 0$, there is a non-zero probability that the motion stays within the bound on a finite interval, i.e. $$\mathbb{P}\left(\sup_{t\in[0,1]} |B(t)| < \epsilon\right) > 0?$$

I tried using the reflection principle. I can show that if $B^*$ is the motion reflected at $\tau$, the first hitting time of the bound, then $$\mathbb{P}\left(\sup_{t\in[0,1]} |B(t)| > \epsilon\right) = \mathbb{P}(|B(1)| > \epsilon) + \mathbb{P}(|B^*(1)| > \epsilon) - \mathbb{P}(\tau < 1,\ |B(1)-B(\tau)| > 2\epsilon).$$ However, I have no idea how to bound the third term in a useful way, i.e. one which would show that this probability is strictly less than one.

yaakov
  • @Nate, thanks, I've deleted my answer. My original idea was to use the fact that $B_t$ is bounded, hence the supremum should also be. The initial hand-waving in my head was convincing, so I posted the answer without checking it thoroughly. I apologise for that. – mpiktas May 12 '11 at 17:46

5 Answers

34

Here are three different methods of showing that $\mathbb{P}(\sup_{t\in[0,1]}\vert B_t\vert < \epsilon)$ is nonzero.

A simple argument based on intuition. You can break the unit interval into a lot of small time steps and, by continuity, the Brownian motion will not move much across each of these steps. By independence of the increments, there is a positive (but small) probability that they largely cancel out, so $B$ stays within $\epsilon$ of the origin. To make this precise, choose a positive integer $n$ such that $q\equiv\mathbb{P}(\sup_{t\le1/n}\vert B_t\vert < \epsilon/2)$ is nonzero ($q$ can be made as close to 1 as you like, by taking $n$ large). By symmetry, the event $\{\sup_{t\le1/n}\vert B_t\vert < \epsilon/2,\ B_{1/n}>0\}$ has probability $q/2$. Note that, if $\sup_{t\in[k/n,(k+1)/n]}\vert B_t-B_{k/n}\vert < \epsilon/2$ and $B_{(k+1)/n}-B_{k/n}$ has the opposite sign to $B_{k/n}$ for each $k=0,1,\ldots,n-1$ then $\vert B_t\vert$ will be bounded by $\epsilon/2$ at the times $k/n$ and, therefore, $\sup_{t\le1}\vert B_t\vert$ will be less than $\epsilon$. So, $\mathbb{P}(\sup_{t\le1}\vert B_t\vert < \epsilon)\ge(q/2)^n$.
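This intuitive argument is easy to spot-check numerically. Below is a quick Monte Carlo sketch (not part of the proof; the function name and parameters are my own choices): it simulates discretized Brownian paths and counts the fraction staying inside $(-\epsilon,\epsilon)$. Note the discrete-time maximum slightly underestimates the true supremum, so the estimate is biased a little upward.

```python
import numpy as np

def mc_prob_sup_below(eps, n_steps=1000, n_paths=5000, seed=0):
    """Monte Carlo estimate of P(sup_{t<=1} |B_t| < eps), using a
    random-walk discretization of Brownian motion on [0,1]."""
    rng = np.random.default_rng(seed)
    dt = 1.0 / n_steps
    # Independent Gaussian increments, one row per sample path.
    increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    paths = np.cumsum(increments, axis=1)
    # Fraction of paths whose discretized running maximum stays below eps.
    return np.mean(np.max(np.abs(paths), axis=1) < eps)

# Positive, and close to the exact value 0.3708 computed further down.
print(mc_prob_sup_below(1.0))
```

The point of the check is only positivity: even for small $\epsilon$ a non-negligible fraction of paths stays inside the band, consistent with the $(q/2)^n$ lower bound.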

Use a cunning trick. If $X,Y$ are independent Brownian motions over the interval $[0,1]$, then $B=(X-Y)/\sqrt{2}$ is also a Brownian motion. The sample paths of $X,Y,B$ can be considered as lying in the (complete, separable) metric space $C([0,1])$ of continuous functions $[0,1]\to\mathbb{R}$ under the supremum norm. By separability, $C([0,1])$ can be covered by countably many open balls of radius $\epsilon/\sqrt{2}$. So, by countable additivity of the probability measure, there exists at least one such ball containing $X$ with probability $q > 0$. By independence, $X,Y$ are both contained in this ball with probability $q^2 > 0$, in which case $\Vert B\Vert_\infty=\Vert X- Y\Vert_\infty/\sqrt{2}<\epsilon$.
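The trick itself is measure-theoretic, but the distributional fact it rests on, that $(X-Y)/\sqrt{2}$ is again a standard Brownian motion, is easy to spot-check at a fixed time. A minimal sketch (variable names are mine), checking the time-1 marginal is standard normal:

```python
import numpy as np

# If X_1, Y_1 are independent N(0,1) (time-1 values of independent
# Brownian motions), then (X_1 - Y_1)/sqrt(2) should again be N(0,1).
rng = np.random.default_rng(1)
x = rng.normal(size=200_000)
y = rng.normal(size=200_000)
b = (x - y) / np.sqrt(2)
# Sample mean and variance: should be close to 0 and 1 respectively.
print(b.mean(), b.var())
```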

Exact calculation. You can calculate an exact expression for the probability, as an infinite sum, and verify that it is dominated by a single positive term as $\epsilon$ goes to zero. This is not as simple as the intuitive argument I gave above, but has the advantage that it also gives an accurate asymptotic expression for the probability, which goes to zero like $e^{-\pi^2/(8\epsilon^2)}$ as $\epsilon\to0$ (this is positive, but tends to zero very quickly).

The probability can be calculated using the reflection principle (also see my comments and Douglas Zare's answer to this question). Writing $p(x)=(2\pi)^{-1/2}e^{-x^2/2}$ for the probability density function of $B_1$ and $f(x)=\sum_{n=-\infty}^\infty(-1)^n1_{\{(2n-1)\epsilon < x < (2n+1)\epsilon\}}$ (which is a kind of square wave function), $$ \mathbb{P}\left(\sup_{t\le1}\vert B_t\vert < \epsilon\right)=\mathbb{E}[f(B_1)]=\int_{-\infty}^\infty f(x)p(x)\,dx.\qquad{\rm(1)} $$ This expression comes from the reflection principle, which says that reflecting $B$ after it first hits $\pm\epsilon$ gives another Brownian motion. That is, $\hat B_t\equiv B_t+1_{\{t\ge T\}}2(B_T-B_t)$ is a Brownian motion, where $T$ is the first time at which $\vert B_T\vert=\epsilon$. As $f$ is antisymmetric about both $\epsilon$ and $-\epsilon$, the sum $f(B_1)+f(\hat B_1)$ vanishes whenever $T\le1$. So, $1_{\{T > 1\}}=(f(B_1)+f(\hat B_1))/2$, and taking the expectation gives (1).
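Formula (1) can be evaluated numerically by integrating the square wave against the Gaussian density term by term, which turns it into an alternating sum of normal CDF differences. A sketch (the truncation level `n_max` is my own choice; the tails decay fast enough that a modest cutoff suffices):

```python
from math import erf, sqrt

def Phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def prob_sup_below(eps, n_max=50):
    """Evaluate (1): integrate the square wave f against the N(0,1)
    density, giving sum_n (-1)^n [Phi((2n+1)eps) - Phi((2n-1)eps)],
    truncated at |n| <= n_max."""
    return sum((-1) ** n * (Phi((2 * n + 1) * eps) - Phi((2 * n - 1) * eps))
               for n in range(-n_max, n_max + 1))

print(prob_sup_below(1.0))  # about 0.3708
```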

You can perform the integral in (1) directly to express the probability as an infinite sum over the cumulative normal distribution function, but this is not so good in the limit where $\epsilon$ is small, as you don't have a single dominant term. Alternatively, the integral in (1) can be written as $\int_{-\epsilon}^\epsilon\theta(x)\,dx$ where $\theta(x)=\sum_{n=-\infty}^\infty(-1)^np(x+2n\epsilon)$. As $\theta$ has period $4\epsilon$ you can write it as a Fourier series, and working out the coefficients gives $$ \theta(x)=\epsilon^{-1}\sum_{\substack{n > 0,\\n{\rm\ odd}}}\cos\left(\frac{n\pi x}{2\epsilon}\right)\exp\left(-\frac{n^2\pi^2}{8\epsilon^2}\right). $$ This is a very fast converging sum, especially for small $\epsilon$ (the terms vanish much faster than exponentially in $n$). In fact, $\theta$ is a theta function, and this Fourier expansion is equivalent to the Jacobi identity for theta functions. Integrating term by term gives $$ \mathbb{P}\left(\sup_{t\le1}\vert B_t\vert < \epsilon\right)=\sum_{\substack{n > 0,\\ n{\rm\ odd}}}\frac{4}{n\pi}(-1)^{(n-1)/2}\exp\left(-\frac{n^2\pi^2}{8\epsilon^2}\right). $$ As the first term goes to zero much more slowly than the sum of the remaining terms (as $\epsilon\to0$), this gives the asymptotic expression $$ \mathbb{P}\left(\sup_{t\le1}\vert B_t\vert < \epsilon\right)\sim\frac{4}{\pi}\exp\left(-\frac{\pi^2}{8\epsilon^2}\right). $$
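The theta-function series and its leading-term asymptotic can be compared directly. A sketch (function names and truncation level are my own; for small $\epsilon$ the leading term dominates overwhelmingly):

```python
from math import exp, pi

def prob_series(eps, n_terms=25):
    """Theta-function series for P(sup_{t<=1} |B_t| < eps): sum over
    odd n > 0 of (4/(n pi)) (-1)^((n-1)/2) exp(-n^2 pi^2 / (8 eps^2))."""
    total = 0.0
    for n in range(1, 2 * n_terms, 2):  # n = 1, 3, 5, ...
        total += ((4.0 / (n * pi)) * (-1) ** ((n - 1) // 2)
                  * exp(-n * n * pi * pi / (8.0 * eps * eps)))
    return total

def prob_asymptotic(eps):
    """Leading term (4/pi) exp(-pi^2/(8 eps^2)), accurate for small eps."""
    return (4.0 / pi) * exp(-pi * pi / (8.0 * eps * eps))

for eps in (1.0, 0.5, 0.3):
    print(eps, prob_series(eps), prob_asymptotic(eps))
```

Already at $\epsilon=0.5$ the two agree to many digits, while the probability itself, though tiny, stays strictly positive.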

  • 2
    Thanks. This is exactly the kind of answer I was looking for. – yaakov May 14 '11 at 20:05
  • 1
    This is very nice! – Nate Eldredge May 15 '11 at 01:15
  • It may be instructive to consider more general Levy processes. – Shai Covo May 15 '11 at 06:45
  • btw, I added another argument based on a cunning trick which just occurred to me. – George Lowther May 15 '11 at 08:40
  • @Shai: The first two arguments also apply to any symmetric Lévy process, although they could be extended to more general cases (the result does not hold for every Lévy process, of course). – George Lowther May 15 '11 at 08:42
  • I don't think infinite is really the correct condition (assuming you mean infinite support?). You could write out a list of conditions on the Lévy process for the result to hold (e.g., if there is a Brownian component, or if arbitrarily small jumps can occur in both directions, etc), but it seems a bit messy. – George Lowther May 15 '11 at 21:30
  • A reasonable necessary and sufficient condition for ${\rm P}(\sup _{t \in [0,1]} |X_t | < \varepsilon ) > 0$ $\forall \varepsilon > 0$, where $X$ is a Lévy process.

    Notation first: $\sigma^2 \geq 0$ is the variance parameter of the Brownian component, $\nu$ is the Lévy measure, $\kappa := \int_{|x| \le 1} {|x|\,\nu (dx)} \le \infty$, and, in case $\kappa < \infty$, $\gamma_0$ is the drift; that is, $X_t = \gamma_0 t + \sigma B_t + Y_t$, where $B$ is a standard BM independent of $Y$, which is a pure jump process (sum of jumps) of finite variation (possibly the zero process).

    – Shai Covo May 18 '11 at 12:56
  • Reasonable conjecture. ${\rm P}(\sup _{t \in [0,1]} |X_t | < \varepsilon ) > 0$ $\forall \varepsilon > 0$ if and only if one of the following conditions is satisfied: (continued next comment) – Shai Covo May 18 '11 at 12:57
  • 1) $\sigma^2 > 0$ or $\kappa = \infty$ (equivalently, $X$ is a process of infinite variation); 2) $\sigma^2 = 0$, $\kappa < \infty$, and $\gamma_0 = 0$; 3) $\sigma^2 = 0$, $\kappa < \infty$, $\gamma_0 \neq 0$, and $X$ can have arbitrarily small jumps with sign opposite to that of $\gamma_0$ (for example, if $\gamma_0 < 0$, this means that $\nu((0,\delta)) > 0$ for any $\delta > 0$). [If none of the above conditions is satisfied and, for example, $\gamma_0 < 0$, then choose $\varepsilon$ such that $\gamma_0 \leq -\varepsilon$ and $\nu((0,2\varepsilon)) = 0$.] – Shai Covo May 18 '11 at 12:58
  • I know it's super late, but could you explain, or point me to a proof, why the event $\{\sup_{t\le1/n}\vert B_t\vert < \epsilon/2,\ B_{1/n}>0\}$ has probability $q/2$? @GeorgeLowther – WhyMeasureTheory Oct 27 '21 at 07:48