
This question comes from the following problem: in each of $n$ independent tests a fair coin is flipped repeatedly until the reverse side appears; what is the expectation of the maximum number of flips over the $n$ tests?

Let $X$ be the maximum number of flips among the $n$ independent tests. Its distribution should be: $$ \begin{align} \text{Pr}(X = 1) &= \frac{1}{2^n} \\ \text{Pr}(X = 2) &= \sum_{i=1}^n {}_n C_i \left(\frac{1}{2^2} \right)^i \left(\frac{1}{2} \right)^{n-i} = \left(\frac{3}{4} \right)^n - \left(\frac{1}{2} \right)^n \\ \text{Pr}(X = 3) &= \sum_{i=1}^n {}_n C_i \left(\frac{1}{2^3} \right)^i \left(\frac{1}{2} + \frac{1}{2^2} \right)^{n-i} = \left(\frac{7}{8} \right)^n - \left(\frac{3}{4} \right)^n \\ & \cdots \\ \text{Pr}(X = k) &= \sum_{i=1}^n {}_n C_i \left(\frac{1}{2^k} \right)^i \left(\sum_{j=1}^{k-1}\frac{1}{2^j} \right)^{n-i} = \left(\frac{2^k-1}{2^k} \right)^n - \left(\frac{2^{k-1}-1}{2^{k-1}} \right)^n \end{align} $$

Therefore, I get the following series for the expectation: $$ E(X) = \sum_{k=1}^{\infty} k \cdot \text{Pr}(X=k) = \sum_{k=1}^{\infty} {k \cdot \left[ (1-2^{-k})^n - (1-2^{-k+1})^n \right]} $$
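As a quick sanity check of the pmf and the series above, here is a minimal sketch (added for verification only, not part of the original derivation; the function names and the truncation limit are illustrative) comparing the truncated series with a Monte Carlo simulation of the maximum number of flips:

```python
import random

def simulate_max_flips(n, trials=20_000):
    """Monte Carlo estimate of E(X): in each trial run n independent tests,
    each flipping a fair coin until the reverse side appears, and record
    the maximum number of flips among the n tests."""
    total = 0
    for _ in range(trials):
        max_flips = 0
        for _ in range(n):
            flips = 1
            while random.random() < 0.5:   # obverse side -> keep flipping
                flips += 1
            max_flips = max(max_flips, flips)
        total += max_flips
    return total / trials

def series_expectation(n, kmax=200):
    """E(X) from the series sum_k k * [(1 - 2^-k)^n - (1 - 2^-(k-1))^n]."""
    return sum(k * ((1 - 2.0**-k)**n - (1 - 2.0**-(k - 1))**n)
               for k in range(1, kmax + 1))

if __name__ == "__main__":
    for n in (2, 10, 100):
        print(n, round(series_expectation(n), 3), round(simulate_max_flips(n), 3))
```

For small $n$ the two values agree to within simulation noise (for $n=2$ the series gives $8/3\approx 2.67$).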

See also Expectation of the maximum of i.i.d. geometric random variables for a similar problem. Unfortunately, there seems to be no closed-form expression for this sum.

  • The answer should be something about $\log n$. – Daniel Wang Apr 10 '23 at 09:38
  • Welcome to MSE. Your question is phrased as an isolated problem, without any further information or context. This does not match many users' quality standards, so it may attract downvotes, or be closed. To prevent that, please [edit] the question. This will help you recognize and resolve the issues. Concretely: please provide context, and include your work and thoughts on the problem. These changes can help in formulating more appropriate answers. – José Carlos Santos Apr 10 '23 at 09:42
  • Regarding the above comment, just edit the question to include more of "Expectation of maximum times of doing $n$ times of flipping coin tests until getting the reverse side" and how you arrived at this equation. Also, if you are familiar with some methods of attacking this problem, or have attempted it but are stuck somewhere, include that. Also, please restate the problem in the main body of the question, according to this. Your question should be clear without the title. – D S Apr 10 '23 at 10:34
  • Can you clarify the question? "Maximum times" means you get the same side $X$ times in a row, or you get that side $X$ times in the $n$ trials? – D S Apr 10 '23 at 11:05
  • It means the maximum number of times the same side appears in a row among the $n$ tests – Daniel Wang Apr 10 '23 at 11:14
  • Then how does getting it once ($X=1$) have a probability of $1/2^n$? I think I don't understand the problem. Can you provide some examples? – D S Apr 10 '23 at 11:22
  • $X=1$ means every test got the reverse side on the first flip. Thus, the probability should be $\left(\frac{1}{2}\right)^n$ – Daniel Wang Apr 10 '23 at 11:26
  • See A158466/A158467 on https://oeis.org – Mariusz Iwaniuk Apr 10 '23 at 11:33
  • It seems that the paper "The binomial transform and the analysis of skip lists" has solved this problem. – Daniel Wang Apr 10 '23 at 11:46
  • $$\sum _{k=0}^{\infty } k \left(\left(1-2^{-k}\right)^n-\left(1-2^{-k+1}\right)^n\right)=\sum _{j=1}^n \frac{2^j \binom{-1+j-n}{j}}{1-2^j}$$ Can't get a closed-form. – Mariusz Iwaniuk Apr 10 '23 at 15:56

1 Answer


Partial solution

I'm not sure whether a closed form for the sum can be obtained, but we can get a very simple approximation that works remarkably well over a wide range of $n$.

First, summing by parts, we note that $$S(n)=\sum_{k=1}^{\infty} k\left( (1-2^{-k})^n - (1-2^{-k+1})^n \right)=\lim_{N\to\infty}\left((N+1)\Big(1-\frac1{2^{N+1}}\Big)^n-\sum_{k=0}^N\Big(1-\frac1{2^k}\Big)^n\right)$$ $$=\lim_{N\to\infty}\left(N+1-\sum_{k=0}^N\Big(1-\frac1{2^k}\Big)^n\right)=1+\lim_{N\to\infty}\big(N-S_0(N,n)\big),\tag{1}$$ where $S_0(N,n)=\sum_{k=0}^N\big(1-\frac1{2^k}\big)^n$.

To evaluate this sum we could probably use the Abel–Plana formula, but in this case it is also convenient to use the Euler–Maclaurin summation formula. Denoting $\displaystyle f(k)=\Big(1-\frac1{2^k}\Big)^n=e^{n\ln(1-e^{-k\ln2})}$, we notice that $$f(0)=0; \quad f(N)=\Big(1-\frac1{2^N}\Big)^n\to 1\,\,\text{as}\,\, N\to\infty\,\,\text{for fixed}\,\,n,$$ $$f'(k)=n\ln2\,\frac{\big(1-\frac1{2^k}\big)^n}{2^k-1}=\frac{n\ln2}{2^{nk}}\big(2^k-1\big)^{n-1};\quad f'(0)=0\,\,\text{for}\,\,n>1;\quad f'(N)\to 0\,\,\text{as}\,\, N\to\infty.$$ The same happens with the higher derivatives: $f^{(j)}(0)=0$ for $j<n$, and $f^{(j)}(N)\to 0$ as $N\to\infty$; moreover, $f^{(j)}(0)\to 0$ as $n\to\infty$.
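Before continuing, here is a quick numerical check of (1), added for illustration (the truncation limits are arbitrary); the summation-by-parts form already matches the direct series at moderate $N$:

```python
def series(n, kmax=300):
    """Direct evaluation of S(n) from the defining series."""
    return sum(k * ((1 - 2.0**-k)**n - (1 - 2.0**-(k - 1))**n)
               for k in range(1, kmax + 1))

def by_parts(n, N=300):
    """(N+1)(1 - 2^-(N+1))^n - sum_{k=0}^N (1 - 2^-k)^n, as in (1)."""
    return ((N + 1) * (1 - 2.0**-(N + 1))**n
            - sum((1 - 2.0**-k)**n for k in range(N + 1)))

for n in (5, 50, 500):
    print(n, series(n), by_parts(n))
```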

Therefore, using the Euler–Maclaurin formula, we can approximate $S_0(N,n)$ in (1) as $$S_0(N,n)\sim\frac{f(N)}2+\int_0^N(1-e^{-k\ln2})^ndk$$ $$=\frac{f(N)}2+k\left(1-e^{-k\ln2}\right)^n\bigg|_0^N-n\ln2\int_0^Nk\left(1-e^{-k\ln2}\right)^{n-1}e^{-k\ln2}dk$$ Making the substitution $x=e^{-k\ln2}$, and keeping in mind that $N\to\infty$, $$S_0(N,n)\sim\frac12+N+\frac n{\ln2}\int_0^1(1-x)^{n-1}\ln x\,dx=N+\frac12-\frac{\psi(n+1)+\gamma}{\ln2}\tag{2}$$ where we used the Beta-integral evaluation $\displaystyle\int_0^1(1-x)^{n-1}\ln x\,dx=-\frac{\psi(n+1)+\gamma}{n}$. Putting (2) into (1), $$\boxed{\,\,S(n)=\sum_{k=1}^{\infty} k\Big( (1-2^{-k})^n - (1-2^{-k+1})^n \Big)\sim\frac{\psi(n+1)+\gamma+\frac{\ln2}2}{\ln2}\,\,}$$ Note that $n$ can be any positive number (not necessarily an integer); as $n\to\infty$, $\displaystyle \psi(n+1)=\ln n+\frac1{2n}+ \,...$
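A small numerical check of that Beta-integral step, added for illustration (it assumes SciPy and NumPy are available; the sample values of $n$ are arbitrary):

```python
import math
import numpy as np
from scipy.integrate import quad
from scipy.special import digamma

def integral_side(n):
    # n/ln2 * integral_0^1 (1-x)^(n-1) * ln(x) dx, evaluated numerically
    value, _ = quad(lambda x: (1 - x)**(n - 1) * math.log(x), 0, 1)
    return n / math.log(2) * value

def digamma_side(n):
    # -(psi(n+1) + gamma) / ln 2
    return -(digamma(n + 1) + np.euler_gamma) / math.log(2)

for n in (3, 7.5, 40):          # n need not be an integer
    print(n, integral_side(n), digamma_side(n))
```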

A numeric check with WolframAlpha shows that the approximation works very well over a wide range of $n$:

$\displaystyle n=1000\quad S(1000)=\color{blue}{11.29780}9990...;\quad\text{approximation}\,=\color{blue}{11.29780}8994...$

$\displaystyle n=100\qquad S(100)=\color{blue}{7.98380}15...;\quad\text{approximation}\,=\color{blue}{7.98380}38...$

$\displaystyle n=10\qquad S(10)=\color{blue}{4.725}55...;\quad\text{approximation}\,=\color{blue}{4.725}60...$

$\displaystyle n=5\qquad S(5)=\color{blue}{3.7941}628...;\quad\text{approximation}\,=\color{blue}{3.7941}536...$

$\displaystyle n=2\qquad S(2)=\color{blue}{2.66}66...;\quad\text{approximation}\,=\color{blue}{2.66}40...$

$\displaystyle n=1\qquad S(1)=2;\quad\text{approximation}\,=1.942...$
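A sketch reproducing this check in Python rather than WolframAlpha (added for convenience; it assumes SciPy and NumPy, and truncates the series at an arbitrary $k_{\max}$):

```python
import math
import numpy as np
from scipy.special import digamma

def exact(n, kmax=200):
    """S(n) from the series, truncated once the remaining terms are negligible."""
    return sum(k * ((1 - 2.0**-k)**n - (1 - 2.0**-(k - 1))**n)
               for k in range(1, kmax + 1))

def approx(n):
    """The boxed approximation (psi(n+1) + gamma + ln2/2) / ln2."""
    return (digamma(n + 1) + np.euler_gamma + math.log(2) / 2) / math.log(2)

for n in (1, 2, 5, 10, 100, 1000):
    print(f"n={n:5d}  exact={exact(n):12.6f}  approx={approx(n):12.6f}")
```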

Svyatoslav