1

Consider a discrete random walk with i.i.d. steps $X_k$ taking the values $+1$ or $-1$ with probabilities $p$ and $q = 1-p$, respectively, and let $S_n = \sum_{k=1}^{n}X_k$. Let $[-A,B]$ be an interval with $A,B \geq 1$. Now define $$\tau =\min\{n \geq 0 : S_n \leq -A \text{ or } S_n \geq B\}.$$

I need to show that $\tau$ is finite almost surely.

What I am trying: $P_0(\tau < \infty) = P_0(\exists\, n<\infty \mbox{ such that } S_n=-A\mbox{ or }S_n=B)$

Can someone point me in the right direction here? My hint was to use the strong law of large numbers.

Kerry
    The hint to use some law of large numbers works like a charm when $p\ne q$ and fails miserably when $p=q$. Do you assume that $p\ne q$? (If you do not, you should sue the author of the "hint"...) – Did Oct 07 '15 at 14:49
  • @Did The author makes no assumptions about $p$ and $q$. – Kerry Oct 08 '15 at 03:53
  • Then, can you think of a basic fact implying the result when $p\ne q$? – Did Oct 08 '15 at 06:07
  • @Did Of course. The supermartingale/submartingale would tend to positive/negative infinity in the limit. – Kerry Oct 08 '15 at 13:48

2 Answers

5

We need notation. Say $X_1,\dots$ are iid with the specified distribution, and $$S_n=X_1+\dots+X_n.$$

Say $N=A+B$. Note that if there exists $n$ with $$X_n=X_{n+1}=\dots=X_{n+N-1}=1$$then $\tau<\infty$ (because why, exactly?).

So let $E_n$ be the event $$X_{nN}=X_{nN+1}=\dots=X_{(n+1)N-1}=1.$$The events $E_n$ are independent, each with probability $p^N$, and $\bigcup_n E_n$ implies $\tau<\infty$. So it's enough to show $$P\left(\bigcup_nE_n\right)=1.$$This follows from the second Borel–Cantelli lemma, since $\sum_n P(E_n)=\sum_n p^N=\infty$ when $p>0$ (and if $p=0$ the walk moves straight down and hits $-A$ at time $A$).
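As a quick numerical illustration (a sketch, not part of the argument; the function name `exit_time` and the parameter choices below are mine, not from the thread): even in the symmetric case $p=q=1/2$, every simulated path leaves $[-A,B]$ in finite time.

```python
import random

def exit_time(p, A, B, rng, max_steps=10**6):
    """Return the first n with S_n <= -A or S_n >= B (None if not reached)."""
    s = 0
    for n in range(1, max_steps + 1):
        s += 1 if rng.random() < p else -1  # step is +1 w.p. p, else -1
        if s <= -A or s >= B:
            return n
    return None

rng = random.Random(0)
A, B = 3, 4
times = [exit_time(0.5, A, B, rng) for _ in range(10_000)]
assert all(t is not None for t in times)  # every simulated path exited
```

The `max_steps` cap is only a safety guard for the loop; the argument above shows it is essentially never the reason the walk stops, since a block of $N=A+B$ consecutive $+1$ steps eventually occurs.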

  • So you are proposing that the problem is equivalent to showing the stopping time to hit A+B is finite. That makes sense. I don't understand how you are defining the event $E_n$. – Kerry Oct 06 '15 at 14:55
  • 1
    No, I didn't say anything about the time to hit $A+B$. I don't understand what you don't understand about the definition of $E_n$. – David C. Ullrich Oct 06 '15 at 15:12
  • I am still trying to digest your argument. So if $E_n$ happens, we guarantee the stopping time is finite almost surely? – Kerry Oct 07 '15 at 14:13
  • 2
    No. If $E_n$ happens then the stopping time is finite. Say $F$ is the event "stopping time is finite". Then $E_n\subset F$. (Hence the stopping time is finite if $\bigcup_nE_n$ happens, and it turns out that that union has probability $1$.) – David C. Ullrich Oct 07 '15 at 14:18
  • Would it be possible to make your definition of $E_n$ more clear? I am trying to use examples to understand what it is, but I am still confused by the meaning. – Kerry Oct 07 '15 at 14:44
  • @Ryan Would it be possible to make clearer the problem you have with the explicit definition of En in this post? – Did Oct 07 '15 at 14:47
  • @Did Sorry I am being unclear. I don't know what to say, though. I just don't understand why we wish to study that particular event. I understand everything written before $E_n$ is introduced. – Kerry Oct 08 '15 at 03:52
  • 2
    We study $E_n$ because it's simple enough that we can figure out things about it, but it also has the property that it implies a finite stopping time. If at some point you take $A+B$ steps in the same direction you must have hit the wall, right? – David C. Ullrich Oct 08 '15 at 04:29
  • @DavidC.Ullrich I agree with you in the fact that if you take A+B steps in the same direction, then you must hit the wall. Maybe it would help if you added another realization of $E_n$ in between the 'dot dot dot'. Again, sorry that I cannot be more helpful in explaining what I don't understand. – Kerry Oct 08 '15 at 04:39
  • 3
    $$E_n=\bigcap_{k=1}^N\{X_{nN+k-1}=1\}.$$ – Did Oct 08 '15 at 06:05
4

Consider the following extremely useful proposition "What always stands a reasonable chance of happening will (almost surely) happen - sooner rather than later."

Let $T$ be a stopping time such that, for some $N$ and $\epsilon>0$, we have $P(T \leq n + N\mid\mathcal{F}_n) > \epsilon$ a.s. for all $n$. Then $E[T] < \infty$, and in particular $T<\infty$ a.s.

Hint: Using induction and $P(T > kN) = P(T>kN; T>(k-1)N)$ show $P(T > kN) \leq (1-\epsilon)^k$.

Now how can we apply the proposition (which is exercise E10.5 in David Williams' Probability with Martingales)? No matter where you are in your random walk, you always have a fixed positive probability $p^{A+B}>0$ that the next $A+B$ steps are all $+1$, in which case you stop within at most $A+B$ further steps. Thus the proposition (with $N=A+B$ and $\epsilon = p^{A+B}$) implies $E[\tau] < \infty$.
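As a sanity check of the bound behind the hint (again a simulation sketch; `exit_time` and the parameters are illustrative choices of mine), the empirical tail $P(\tau > kN)$ should stay below the geometric bound $(1-\epsilon)^k$ with $N=A+B$ and $\epsilon = p^{A+B}$:

```python
import random

def exit_time(p, A, B, rng):
    """First n with S_n <= -A or S_n >= B for a +/-1 random walk from 0."""
    s, n = 0, 0
    while -A < s < B:
        n += 1
        s += 1 if rng.random() < p else -1
    return n

rng = random.Random(1)
p, A, B = 0.5, 2, 2
N = A + B                 # block length from the proposition
eps = p ** N              # chance a block of N steps is all +1
trials = 20_000
times = [exit_time(p, A, B, rng) for _ in range(trials)]

for k in range(1, 6):
    tail = sum(t > k * N for t in times) / trials
    # empirical tail should sit below the geometric bound (1 - eps)^k
    assert tail <= (1 - eps) ** k
```

With $p=q=1/2$ and $A=B=2$ the sample mean of $\tau$ also lands near $AB=4$, the classical value of $E[\tau]$ for the symmetric walk, consistent with $E[\tau]<\infty$.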

nullUser
  • "What always stands a reasonable chance of happening will (almost surely) happen - sooner rather than later." Of course this needs to be qualified: assume that $X_n=X_1$ with $X_1$ standard normal then $P(X_n>0)=\frac12$ for every $n$ hence $X_n>0$ "always stands a reasonable chance of happening" and yet $P(\exists n,X_n>0)\ne1$. – Did Oct 08 '15 at 13:56
  • Yes, one should be careful to always keep the theorem statement in mind so as not to let the language deceive. In your case $P(X_n > 0) = \frac{1}{2}$, but the theorem would require $P(\{X_n > 0\} \cup \ldots \cup \{X_{n+N} > 0\} \mid \mathcal{F}_n) > \epsilon$, which is of course false. – nullUser Oct 11 '15 at 23:36
  • nullUser, have you any idea how to show such a proposition applies to any stopping time (when applicable)? http://math.stackexchange.com/questions/1562635/asymmetric-random-walk-prove-that-et-inf-n-x-n-b-infty – BCLC Dec 11 '15 at 10:03
  • nullUser what are the $N$ and $\epsilon$ in this case? – BCLC Dec 12 '15 at 17:31
  • 1
    @BCLC $N=A+B$ and $\epsilon = p^{A+B}$. – nullUser Jan 15 '16 at 19:09
  • @nullUser aaahhhh thanks! ^-^ – BCLC Jan 15 '16 at 19:21