
Pick a point at random in the interval $[0,1]$, call it $P_1$.

Pick another point at random in the interval $[0,P_1]$, call it $P_2$.

Pick another point at random in the interval $[0,P_2]$, call it $P_3$.

Etc...

Let $S = P_1+P_2+P_3+\cdots$

What is the probability that $S$ is divergent?

Any thoughts?

P.S. Random, in this particular case, means uniformly distributed, i.e. $P(a<P_1<b)=b-a$ for $0\le a<b\le 1$.
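
A minimal simulation sketch of this process makes it easy to experiment with the question (assuming Python with NumPy; the helper name `sample_S` and the truncation depth are arbitrary choices, harmless here because the $P_k$ shrink very quickly):

```python
# Sketch: simulate P_1, P_2, ... and the (truncated) sum S.
import numpy as np

rng = np.random.default_rng(0)

def sample_S(depth=60):
    total, current = 0.0, 1.0
    for _ in range(depth):
        current = rng.uniform(0.0, current)  # P_k is uniform on [0, P_{k-1}]
        total += current
    return total

samples = [sample_S() for _ in range(10_000)]
print("sample mean of S:", np.mean(samples))
print("largest observed S:", np.max(samples))
```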

nullgeppetto
Elie Bergman
  • Hint: Compute $E(P_k)$ for every $k$ and deduce $E(S)$. If ever $E(S)$ is finite, this tells you that... – Did May 19 '14 at 07:51
  • Well, it seems like $E(P_k) = 1/2^k$ and $E(S)=1$. Is this legit reasoning? $E(P_k)=0.5\,E(P_{k-1}) \overset{?}{=} 0.5^2\,E(P_{k-2})=\cdots$

    Also, if this is correct, it would imply that whatever the probability distribution, no matter how skewed towards 1 it is, the sum is always finite: since $0<E(P_1)<1$, the infinite sum $E + E^2 + E^3 + \cdots$ converges.

    – Elie Bergman May 19 '14 at 08:01
  • Very much so. Well done. Let me suggest that you post your own solution as an answer (note that, after a while, you may even accept it). – Did May 19 '14 at 08:03
  • 1
    This is a tail event you are talking about and Kolmogorov's 0-1 law says that this probability is either 0 or 1, so which is it? – Georgy May 19 '14 at 08:21
  • @Georgy This seems off-topic for solving the present question. – Did May 19 '14 at 09:07
  • @Did This seems to be a perfectly legitimate question in probability, no? – Georgy May 20 '14 at 07:00
  • @Georgy You stated that Kolmogorov's 0-1 law was involved (and probably that it was the way to go to solve this exercise). In fact Kolmogorov's 0-1 law is not involved in the sense that the easiest proof does not use it (and one might be able to use it but only to solve a part of the exercise and only with care). By the way, what is the tail event you are alluding to, and to which tail sigma-algebra does it belong? – Did May 20 '14 at 07:23
  • @Did In our case, the limit. If we delete any finite number of the X's we still get the same result. That's the definition of a tail event. – Georgy May 20 '14 at 08:34
  • @Georgy What, "the limit"? The limit of what? You still did not explain the tail event you are considering in terms of the sequence (Pk) and the tail sigma-algebra it belongs to. – Did May 20 '14 at 08:40
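
The geometric-series step discussed in the comments above ($E + E^2 + E^3 + \cdots$ with $E = E(P_1)$) can also be checked symbolically; this is a small sketch assuming SymPy is available, with symbol names of my own choosing:

```python
# Symbolic check: sum_{k>=1} (1/2)^k = 1, and sum_{k>=1} E^k = E/(1-E) for 0 < E < 1.
import sympy as sp

k = sp.symbols('k', integer=True, positive=True)

print(sp.summation(sp.Rational(1, 2)**k, (k, 1, sp.oo)))  # -> 1

E = sp.symbols('E', positive=True)
print(sp.summation(E**k, (k, 1, sp.oo)))  # -> E/(1 - E) when E < 1 (inside a Piecewise)
```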

1 Answer


For every $k$, $\mathbb E(P_k)=1/2^k$, thus: $$\mathbb E(S) = \mathbb E(P_1)+\mathbb E(P_2)+\cdots = 1/2 + 1/4 + \cdots = 1.$$ Since $\mathbb E(S)$ is finite, it follows that $P(S=\infty) = 0$: if $S$ were infinite with positive probability, the expectation $\mathbb E(S)$ would be infinite.
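
As a sanity check, here is a minimal Monte Carlo sketch of this computation (assuming NumPy; the variable names and sample sizes are my own choices), estimating $\mathbb E(P_k)$ for the first few $k$ against $1/2^k$:

```python
# Estimate E(P_k) and E(S) by simulating the chain P_1, ..., P_{k_max}.
import numpy as np

rng = np.random.default_rng(1)
n_runs, k_max = 100_000, 20

P = np.empty((n_runs, k_max))
prev = np.ones(n_runs)             # the first draw is uniform on [0, 1]
for k in range(k_max):
    prev = rng.uniform(0.0, prev)  # P_{k+1} is uniform on [0, P_k]
    P[:, k] = prev

for k in range(5):
    print(f"E(P_{k + 1}) ~ {P[:, k].mean():.4f} (exact {0.5 ** (k + 1):.4f})")

# The neglected tail P_21 + P_22 + ... has expectation 2^-20, so this is ~ E(S).
print("E(S) ~", P.sum(axis=1).mean())
```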

Did
Elie Bergman
  • Clearly it is possible for the sum to diverge. Do you mean that the sum converges with probability $1$? This is not true though, since you do need to know something about the PDF; otherwise, the atomic PDF that always chooses the harmonic series gives you a problem. – Ittay Weiss May 19 '14 at 08:21
  • 1
    Yes your right. I was assuming something crazy; that the PDF was in a sense "self similar" at all levels (as the uniform distribution is). – Elie Bergman May 19 '14 at 08:23
  • 5
    "Something crazy"? I fully disagree. Your remark that "No matter what the PDF for our selection, the sum will always be finite" is actually true, if suitably interpreted. To be precise, if $P_k=X_1X_2\cdots X_k$ for every $k$, for some i.i.d. $[0,1]$-valued sequence $(X_k)$, then $S$ is almost surely finite for every distribution of the random variables $X_k$ (the degenerate case when $X_k=1$ almost surely excepted, naturally). In other words, independence and equi-distribution are enough to conclude. (All these are (simple cases of) well-known models.) – Did May 19 '14 at 08:37
  • 8
    I'd say that proving $E(P_k)=2^{-k}$ requires some work... – 5xum May 19 '14 at 08:55
  • Can you elaborate a little here? – Elie Bergman May 19 '14 at 09:07
  • @ElieBergman: It is not too obvious why $E(P_k) = 2^{-k}$. Any proof must pass through the following steps. Let $f_k$ be the density function for $P_k$. Then $E(P_k) = \int_{[0,1]} E(P_k \mid P_{k-1}=x)\, f_{k-1}(x)\,dx = \int_{[0,1]} \left( \int_{[0,x]} y \,\frac{1}{x-0}\, dy \right) f_{k-1}(x)\,dx = \int_{[0,1]} \frac{x}{2}\, f_{k-1}(x)\,dx = \frac{1}{2} \int_{[0,1]} x\, f_{k-1}(x)\,dx = \frac{1}{2} E(P_{k-1})$. Even if you don't write all the steps, do make sure that you are aware of every one of them. By the way, you must put "@" plus the username, otherwise only the poster will be notified of your comment. – user21820 May 19 '14 at 14:12
  • @user21820 The hypothesis that $P_k$ has a density $f_k$ is not needed hence I am not sure that literally "all the steps" in the proof you suggest must be made aware of. – Did May 19 '14 at 20:07
  • @Did: Well it depends on how you define expectation I guess? My definition involves the density function. If you use some other identities for expectation wouldn't their proofs pass through the same steps? – user21820 May 20 '14 at 01:31
  • @user21820 This cannot be the only definition you have at hand... Otherwise you are unable to compute, say, the expectation of min(1,U) where U is uniform on (0,3)? – Did May 20 '14 at 05:45
  • @Did: What do you mean? Indeed I can compute the expectation of $\min(1,U)$ where $U$ is uniform on $(0,3)$! $E(\min(1,U)) = \int_{(0,3)} \min(1,x) \,\frac{1}{3-0}\,dx = \int_{[0,1]} x \,\frac{1}{3-0}\,dx + \int_{[1,3]} 1 \cdot \frac{1}{3-0}\,dx = \frac{1}{6} + \frac{2}{3} = \frac{5}{6}$. – user21820 May 20 '14 at 05:56
  • @user21820 Then the expectation of V uniform on the standard Cantor set? The point is that there are much wilder distributions than the densitable ones, then you need another approach to compute expectations. – Did May 20 '14 at 06:01
  • @Did: For those wild distributions, you are certainly right that there is no density function in the ordinary sense, but I would simply say that those distributions do not exist. If you disagree, tell me how you would draw a uniformly random number from the Cantor set. You cannot. All you can do is to create a procedure that can approximate a random draw with arbitrary precision, but any finite procedure will have an underlying distribution that has a density function. Anyway what exactly is your rigorous way of using expectations that applies to all kinds of distributions? – user21820 May 20 '14 at 06:18
  • @user21820 Wow. To declare that these do not exist is a highly peculiar move, which would put you apart from the mathematical community as a whole (whatever that notion means)... These do exist, exactly as the Lebesgue probability measure on [0,1] exists. The "rigorous way of using expectations that applies to all kinds of distributions" (which has been universally used for at least 80 years and is not particularly "mine") is called measure theory. – Did May 20 '14 at 06:31
  • @Did: I am aware of that 'peculiar' situation, but indeed no mathematician can create or perform a draw from such weird distributions. And I meant to ask how specifically would you justify the statement concerning $E(P_k)$ at the beginning of the proof above, because it is not right to simply claim it without any justification. – user21820 May 20 '14 at 06:36
  • @Did: By the way, I do know about measure theory, but just because it has been used (within mathematics) for a long time does not mean that all of it applies to the real world. But that's for another discussion; for now I just want to see your rigorous approach to this question. =) – user21820 May 20 '14 at 06:41
  • @user21820 "no mathematician can create or perform a draw from such weird distributions" This is false, in every sense of the term (in fact it is strictly equivalent to draw a sample from the Cantor distribution I mentioned or one from the Lebesgue probability measure). To solve the exercise, I would rely on the construction in my first comment, which (1) provides a solid theoretical framework for the model the OP has in mind, (2) allows to compute each $E(P_k)$ and $E(S)$ with no sweat (and without relying on densities) and (3) shows that the result holds in a much wider setting. – Did May 20 '14 at 06:44
  • @Did: Huh? You truly cannot create such distributions in the real world. It's a matter of opinion whether we should bother about such distributions. But I don't see anything in your first comment "Hint: Compute $E(P_k)$..." that shows how $E(P_k)$ is computed in the first place! – user21820 May 20 '14 at 06:49
  • @user21820 You seem intent on making this thread spiral out of control, so let me stick to two points. First, if your beef is with the existence of the Lebesgue probability measure on [0,1] ("in the real world", as you say) then (i) you should have said so from the start and (ii) I am not interested in the discussion, period. Second, if you are asking me how you could solve the OP's mathematical question without using densities, then please refer to my first comment to this answer and use the independence of the random variables $(X_k)$ introduced there. The rest should follow. – Did May 20 '14 at 06:57
  • @Did: I'm sorry you got my intention wrong; like that I was referring to the real world, it's just a misunderstanding. Likewise, I didn't know you were referring to your comment to this answer, and now I see what you mean. You used the nature of the original procedure to obtain the product, which is perfectly fine and gets the result as you claimed. I was looking at each step of the original procedure separately, which would be the case in a generic procedure. Here there is a very special condition that the distribution is merely scaled at each step, without which we don't get a product. – user21820 May 20 '14 at 07:12
  • @user21820 OK. No hard feelings. – Did May 20 '14 at 07:20
  • @Did This is a stochastic process. It has a certain probability density, and any realization of it is different from the others. In that context the definition of limit has to be modified. Loosely speaking, in some realizations, say n1, this diverges, but in others, say n2, the series converges. We are then supposed to estimate the ratio n1/(n1+n2). – Georgy May 20 '14 at 08:19
  • 1
    @Georgy I wonder what you are trying to achieve with this comment. Everybody even vaguely educated in probability theory is aware of the "modification of the notion of" convergence (rather than "limit") involved here, it is called almost sure convergence. There are quite rigorous ways to describe what you try to do with n1 and n2, and they lead to the conclusion that P(the series converges)=1 and P(the series diverges)=0. So, which mysteries remain? I see none. (Unrelated: what happened to your suggestion of a tail event/Kolmogorov's 0-1 law approach, should we forget it altogether?) – Did May 20 '14 at 08:28
  • @Did I didn't know you were aware of almost sure convergence, so I said a few words to describe it. And I haven't seen a proof of P(converges)=1 yet. – Georgy May 20 '14 at 08:55
  • @Did And about the Kolmogorov's law, I didn't say I have the answer, but I thought if you are looking for a probability, looking for two numbers is better than looking for any number. – Georgy May 20 '14 at 08:58
  • 1
    @Georgy OK, so in the end, you said nothing about nothing... "And I haven't seen a proof of P(converges)=1 yet" Seriously? Look better then. The rest of your last comments does not seem to make much more sense so it might be better to stop here. – Did May 20 '14 at 09:07
  • @Did I am so sorry for asking an easy question. I couldn't show $\mathbb{E}(P_{k})= 1/2^{k}$ by using independence. Could you please help me with the steps? I should admit that I didn't understand the proof using density functions, which I find hard. – Airbag May 21 '14 at 15:24
  • If $P_k=X_1X_2\cdots X_k$ for some independent $X_i$ such that $E(X_i)=1/2$ for every $i$ then $E(P_k)=$ $____$. – Did May 21 '14 at 17:17
  • @Did Just knowing that $E[X_i]=1/2$ is not enough to deduce the value of $E[P_i]$ – Georgy May 23 '14 at 11:31
  • @Georgy Oh yeah? When the $X_i$ are independent (as recalled once again in my last comment), it is very much "enough". – Did May 23 '14 at 12:05
  • @Did Ah, OK, sorry, my bad. – Georgy May 23 '14 at 14:46
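
To illustrate the more general setting mentioned in the comments above, where $P_k = X_1X_2\cdots X_k$ for i.i.d. $[0,1]$-valued $X_k$, here is a hedged simulation sketch (assuming NumPy); the choice $X_k\sim\mathrm{Beta}(5,1)$, heavily skewed toward $1$, is only an illustration, for which $E(X_k)=5/6$ and hence $E(S)=\sum_k(5/6)^k=5$:

```python
# Sketch of the product representation P_k = X_1 * X_2 * ... * X_k with i.i.d. factors.
# Beta(5, 1) is an arbitrary illustrative [0,1]-valued law skewed toward 1.
import numpy as np

rng = np.random.default_rng(2)
n_runs, k_max = 20_000, 300

X = rng.beta(5.0, 1.0, size=(n_runs, k_max))  # i.i.d. factors in [0, 1]
P = np.cumprod(X, axis=1)                     # P_k = X_1 * ... * X_k
S = P.sum(axis=1)                             # truncated sum; the tail is negligible

print("estimated E(S):", S.mean())            # theory: (5/6) / (1 - 5/6) = 5
print("largest S in", n_runs, "runs:", S.max())
```

Every run stays finite, consistent with the comment that independence and identical distribution of the factors already force $S<\infty$ almost surely.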