3

Let $ X_1, X_2, \ldots $ be independent and identically distributed random variables, each following a uniform distribution on the interval $[0,1]$. Define the random variable $ S $ as follows:

$$ S = \sum_{i=1}^{N} \frac{X_i}{2^i}, $$

where $ N $ is the smallest positive integer $ k $ such that $ X_k < X_{k+1} $. If no such $ k $ exists, then $ N = \infty $. Determine $\mathbb{E}[S]$.


I have tried to use the law of total expectation, $\mathbb{E}[S] = \mathbb{E}[\mathbb{E}[S \mid N]]$, and I can work out:

$$ \Pr(N = n) = \Pr(X_1 \ge X_2 \ge \dots \ge X_n, X_n < X_{n + 1}) = \dfrac{n}{(n + 1)!} $$
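One way to see this:

$$ \Pr(N = n) = \Pr(X_1 \ge \dots \ge X_n) - \Pr(X_1 \ge \dots \ge X_{n+1}) = \dfrac{1}{n!} - \dfrac{1}{(n+1)!} = \dfrac{n}{(n+1)!}. $$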

But I am having difficulty computing $\mathbb{E}[S \mid N = n]$ for each $n$.

I also wonder whether there are other approaches to this question besides computing $\mathbb{E}[\mathbb{E}[S \mid N]]$.

  • Welcome to MSE. Your question is phrased as an isolated problem, without any further information or context. This does not match many users' quality standards, so it may attract downvotes or be closed. To prevent that, please [edit] the question. This will help you recognise and resolve the issues. Concretely: please provide context, and include your work and thoughts on the problem. These changes can help in formulating more appropriate answers. – José Carlos Santos Aug 27 '23 at 16:38
  • Sorry I just added some context and how I tried to solve this question. – Linyuan Gong Aug 27 '23 at 16:49
  • $\Bbb E[S\mid N=n]=\dfrac{\Bbb E[S\,\mathbf 1_{\{N=n\}}]}{\Bbb P(N=n)}$, isn't it? – nejimban Aug 27 '23 at 17:21
  • I doubt that the approach of $\mathbb{E}(\mathbb{E}(S \mid N))$ is promising, because knowing $N$ alters the distribution of $X_i$. – leonbloy Aug 27 '23 at 19:50

2 Answers

4

Let $W_k$ be the indicator variable of the event that $X_1 \ge X_2 \ge \cdots \ge X_{k-1} \ge X_k$. Then the sum can be written as

$$ S= \sum_{k=1}^\infty \frac{X_k W_k}{2^k}$$

Now $E[X_k W_k] = P(W_k=1)\,E[X_k \mid W_k = 1]$, and $E[X_k \mid W_k = 1]$ is the expectation of the minimum of $k$ uniform random variables, which is $1/(k+1)$.
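If you want to sanity-check that conditional expectation numerically, here is a rough Monte Carlo sketch (it assumes NumPy, and `check_min_expectation` is just an illustrative name):

```python
import numpy as np

rng = np.random.default_rng(0)

def check_min_expectation(k, trials=200_000):
    """Estimate E[X_k | X_1 >= X_2 >= ... >= X_k] by rejection sampling."""
    x = rng.random((trials, k))                          # trials rows of (X_1, ..., X_k)
    descending = np.all(x[:, :-1] >= x[:, 1:], axis=1)   # rows where X_1 >= ... >= X_k, i.e. W_k = 1
    return x[descending, -1].mean()                      # average of X_k over those rows

for k in (2, 3, 4):
    print(k, round(check_min_expectation(k), 4), 1 / (k + 1))
```

The estimates should sit close to $1/(k+1)$, consistent with $X_k$ being the minimum of the first $k$ variables on the event $W_k = 1$.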

Can you go on from here?

leonbloy
  • 66,202
  • Thank you for your hint! I have just worked out this problem thanks to your hints. I'll write a solution here later on. – Linyuan Gong Aug 27 '23 at 22:11
2

Thanks to leonbloy's hint, I solved this problem, and here is the solution. The result aligns with my simulation results using Python (0.297443):

$$ \begin{aligned} \mathbb{E}[S] &= \mathbb{E}[\sum_{i=1}^N \dfrac{X_i}{2^i}] \\ &= \mathbb{E}[\sum_{i=1}^\infty \dfrac{X_i \mathbf{1}[i \le N]}{2^i}] \\ &= \sum_{i=1}^\infty \dfrac{1}{2^i}\mathbb{E}[X_i \mathbf{1}[N\ge i]] \\ &= \sum_{i=1}^\infty \dfrac{1}{2^i}(\Pr(N\ge i) \mathbb{E}[X_i | N\ge i] + \Pr(N < i)\cdot 0) \\ &= \sum_{i=1}^\infty \dfrac{1}{2^i}\Pr(N\ge i) \mathbb{E}[X_i | N\ge i] \end{aligned} $$

We have

$$ \Pr(N \ge i) = \Pr(X_1 \ge X_2\ge \dots \ge X_i) = \dfrac{1}{i!} $$

(each of the $i!$ orderings of $i$ i.i.d. continuous random variables is equally likely),

and

$$ \mathbb{E}[X_i \mid N\ge i] = \mathbb{E}[X_i \mid X_1 \ge X_2\ge \dots \ge X_i] = \dfrac{1}{i+1}, $$

since, conditional on the descending ordering, $X_i$ is the minimum of $i$ i.i.d. uniform random variables (this is exactly leonbloy's hint).

So

$$ \begin{aligned} \mathbb{E}[S] &= \sum_{i=1}^\infty \dfrac{1}{2^i} \dfrac{1}{i!} \dfrac{1}{i+1} \\ &= \sum_{i=1}^\infty \dfrac{(1/2)^i}{(i + 1)!} \\ &= 2\sum_{i=2}^\infty \dfrac{(1/2)^i}{i!} \\ &= 2 (e^{1/2} - 1 - 1/2) \\ &= 2 e^{1/2} - 3 \end{aligned} $$
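For anyone who wants to reproduce the numerical check, here is a minimal sketch of the kind of Monte Carlo simulation mentioned above (it assumes NumPy; `sample_S` is just an illustrative name, not part of any library):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_S():
    """Draw one realization of S = sum_{i=1}^{N} X_i / 2^i."""
    total, prev = 0.0, 1.1          # prev > 1, so the first draw never triggers the stop
    i = 0
    while True:
        i += 1
        x = rng.random()            # X_i ~ Uniform[0, 1]
        if x > prev:                # X_{i-1} < X_i, so N = i - 1 and the sum is complete
            return total
        total += x / 2**i
        prev = x

trials = 200_000
estimate = np.mean([sample_S() for _ in range(trials)])
print(estimate, 2 * np.exp(0.5) - 3)   # both should be close to 0.2974
```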