In our lecture we proved the following statement (the Erdős–Rényi law of runs):
We consider the probability space $(\Omega:=\{0,1\}^{\mathbb{N}},\mathcal{F},\mathbb{P})$ and define a Bernoulli experiment of length $n$ with success probability $p$. Let $R_n$ be the length of the longest run, i.e. $$ R_n:=\max\left\{l-k\mid 0\leq k<l\leq n, \frac{S_l-S_k}{l-k}=1\right\} $$ where $S_l$, $S_k$ denote the number of successes up to the $l$-th and $k$-th steps, respectively. Then $$ \mathbb{P}\left(\lim\limits_{n\to\infty}\frac{R_n}{\ln(n)}\text{ exists and equals }\frac{1}{\ln\left(\frac{1}{p}\right)}\right)=1. $$
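Not part of the lecture, but the statement can be checked numerically. Here is a small simulation sketch (the helper `longest_run` and all parameter values are my own choices, not from the lecture):

```python
import math
import random

def longest_run(bits):
    """Length of the longest run of 1's in a 0/1 sequence."""
    best = cur = 0
    for b in bits:
        cur = cur + 1 if b == 1 else 0
        best = max(best, cur)
    return best

random.seed(0)
p = 0.5
n = 200_000  # one long Bernoulli(p) experiment

bits = [1 if random.random() < p else 0 for _ in range(n)]
Rn = longest_run(bits)

print(Rn / math.log(n))       # empirical ratio R_n / ln(n)
print(1 / math.log(1 / p))    # predicted limit 1 / ln(1/p) ≈ 1.4427 for p = 1/2
```

For a single sample of this size the ratio still fluctuates by a few tenths around the limit, since $R_n$ only grows logarithmically in $n$, but it is already visibly close to $1/\ln(1/p)$.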
The proof relies on the heuristic assumption that the longest run of the Bernoulli experiment of length $n$ is unique, so that we can use the relation $$1=np^{R_n}\implies R_n=\frac{\ln(n)}{\ln\left(\frac{1}{p}\right)}.$$ To make this concrete: if we run the experiment for $n$ steps, then the outcome $\omega\in\Omega$ contains exactly one block of $R_n$ consecutive $1$'s, e.g. $\omega=(0,1,0,\underset{R_n\text{-many}}{\underbrace{1,1,1,1,\dots,1}},0,0,1,0,1,1,0,\dots)$.
If we run the experiment for $n+1$ steps, then the outcome $\omega'\in\Omega$ contains exactly one block of $R_{n+1}$ consecutive $1$'s, e.g. $\omega'=(0,1,0,\underset{R_{n+1}\text{-many}}{\underbrace{1,1,1,1,\dots,1}},0,0,1,0,1,1,0,\dots)$. And so on.
I don't understand why we can simply make this assumption. Could someone more familiar with this result explain it to me?
