I was doing some software engineering and wanted a thread to do something in the background, basically just wasting CPU time, for a certain test.
While I could have done something really boring like `for (int i = 0; i < 10000000; i++) { j = 2 * i; }`, I ended up having the program start at $1$ and then, for a million steps, choose a random real number $r$ uniformly distributed in the interval $[0, R]$ and multiply the running result by $r$ at each step.
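Something like this minimal C sketch (the helper name `run_trial` and the `rand()`-based draw are illustrative; this isn't my exact program):

```c
#include <stdio.h>
#include <stdlib.h>

/* One trial: start at 1 and multiply by a uniform draw from [0, R]
   at each of `steps` iterations. In double precision the product
   typically underflows to 0 or overflows to inf long before the end. */
double run_trial(double R, long steps) {
    double result = 1.0;
    for (long i = 0; i < steps; i++) {
        result *= R * rand() / (double)RAND_MAX; /* uniform on [0, R] */
    }
    return result;
}

int main(void) {
    srand(1);
    printf("R = 2: %g\n", run_trial(2.0, 1000000));
    printf("R = 3: %g\n", run_trial(3.0, 1000000));
    return 0;
}
```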
- When $R = 2$, it converged to $0$.
- When $R = 3$, it exploded to infinity.
So of course, the question anyone with a modicum of curiosity would ask: for what $R$ does the transition happen? I then tried the first number between $2$ and $3$ that we would all think of, Euler's number $e$, and sure enough, the conjecture held up. I would love to see a proof of this.
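For what it's worth, here is a heuristic (not a proof) for why $e$ is the plausible threshold: the logarithm of the running product is a sum of i.i.d. terms, and for $r$ uniform on $[0, R]$ the mean contribution per step is

$$\mathbb{E}[\ln r] = \frac{1}{R}\int_0^R \ln t \,\mathrm{d}t = \ln R - 1,$$

which is negative for $R < e$ and positive for $R > e$, so the law of large numbers suggests the log-product drifts to $-\infty$ or $+\infty$ accordingly.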
Now, when I should be working, I'm instead wondering about the behavior of this script.
Ironically, rather than wasting my CPU's time, I'm wasting my own. But it's a beautiful phenomenon. I don't regret it. $\ddot\smile$