The elementary probability problem is as follows.
Let $(X_k)_{k\in\mathbb{N}}$ be a sequence of i.i.d. random variables such that $X_k \sim U(0,1)$ for each $k$. Define $\tau := \inf\{n\geq 0: \sum_{i=1}^n X_i > 1\}$. What is $E[\tau]$?
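(As a quick sanity check on the answer below, here is a minimal Monte Carlo sketch; `sample_tau` is my own helper name, not part of the problem.)

```python
import math
import random

def sample_tau(rng: random.Random) -> int:
    """Draw U(0,1) variables until the running sum exceeds 1; return how many were needed."""
    total, n = 0.0, 0
    while total <= 1.0:
        total += rng.random()
        n += 1
    return n

rng = random.Random(0)  # fixed seed for reproducibility
trials = 200_000
est = sum(sample_tau(rng) for _ in range(trials)) / trials
print(est, math.e)  # est should be close to e ≈ 2.71828
```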
There are plenty of clever solutions to this problem. However, I would like to find a solution that uses Doob's optional stopping theorem. I don't know whether such a solution exists; I am asking out of sheer curiosity.
To make life easier I will just assume that $E[\tau] < \infty$; I can prove this separately. Define $$Y_n = \sum_{i=1}^n X_i.$$ Then the compensated process $Z_n := Y_n - \frac{n}{2}$ is a martingale. By the definition of $\tau$ we have $Y_{\tau} > 1$ and $Y_n \leq 1$ for all $n < \tau$. By optional stopping, $E[Z_{\tau}] = 0$, i.e. $E[Y_\tau] = \frac{1}{2}E[\tau]$ (there are a few intermediate steps here that I did not mention, such as verifying the conditions of the theorem, but the statement is fine). This yields the bounds $$E[\tau] > 2 \qquad E[\tau] \leq 3.$$
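The optional stopping identity $E[Y_\tau] = \frac{1}{2}E[\tau]$ can itself be checked numerically. A minimal sketch (the helper name `run_to_crossing` is my own) that estimates both sides from the same sample paths:

```python
import random

def run_to_crossing(rng: random.Random):
    """Run the walk Y_n = X_1 + ... + X_n until it first exceeds 1.
    Returns the pair (Y_tau, tau)."""
    total, n = 0.0, 0
    while total <= 1.0:
        total += rng.random()
        n += 1
    return total, n

rng = random.Random(1)  # fixed seed for reproducibility
trials = 200_000
sum_y = sum_tau = 0.0
for _ in range(trials):
    y, n = run_to_crossing(rng)
    sum_y += y
    sum_tau += n

mean_y = sum_y / trials                  # estimates E[Y_tau]
half_mean_tau = sum_tau / (2 * trials)   # estimates E[tau]/2
print(mean_y, half_mean_tau)  # both should be close to e/2 ≈ 1.359
```

Both estimates agree, consistent with $E[Z_\tau] = 0$; note also that $E[Y_\tau] \approx 1.36$ sits strictly between the bounds $1$ and $\tfrac{3}{2}$ used above.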
Of course we know that $E[\tau] = \exp(1)$. At least the bounds make sense, but how do I make them tighter? Better yet, how do I make the lower and upper bounds coincide?