3

I am working on a problem on the infinite coin-tossing space and I'm having trouble making any meaningful progress.

Let $(\Omega, \mathcal F, P)$ be a probability space, where $\Omega=\{0,1\}^{\mathbb N}$, $\mathcal F$ is the $\sigma$-field generated by the finite-dimensional sets, and $P$ is the fair-coin-tossing probability measure. The finite-dimensional sets are the sets $A=\{\omega\in\Omega : (\omega_1,\omega_2,\ldots,\omega_n)\in B\}$ for some $n\in\mathbb N$ and $B\subset\{0,1\}^n$.

Define $X_n(\omega)=\omega_n \in\{0,1\}$, the result of the $n$th toss, and define $X(\omega)=\sum_{n=1}^{\infty} X_n(\omega)2^{-n}$.

  • Show that the above series converges for every $\omega \in \Omega$ and defines a random variable taking values in $[0,1]$.
  • Show that $X$ has uniform distribution on $[0,1]$.

My attempt:

  • I believe the series converges (absolutely) by the comparison test with $\sum_{n=1}^\infty 2^{-n}$, since $X_n\leq 1$ and hence $\lvert X_n2^{-n} \rvert\leq 2^{-n}$. Now I'm hoping to show that $$E=\{\omega :X(\omega)\in (a,b)\}\in\mathcal F,$$ since if that holds, I can conclude that the preimage of any Borel set will be in $\mathcal F$ as well, because the intervals $(a,b)$ with $a,b\in[0,1]$ generate the Borel sets in $[0,1]$. Now I can certainly find $\omega$'s that will be in $E$, since $X(\omega)$ is a binary expansion of some real number in $[0,1]$, but I'm a bit stuck at this point.

  • I was given a hint to find $P(X\in[i2^{-m},(i+1)2^{-m}])$ for $i=0,\ldots,2^{m}-1$. I'm not sure how to find these probabilities, nor how they can help to show that $X$ has a uniform distribution. (A quick numerical check of the hint is sketched below.)
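
For intuition, here is a quick Monte Carlo sketch of the hint (my own sanity check, approximating $X$ by truncating the series at 40 bits, so this only estimates the probabilities):

```python
import random

def sample_X(bits=40):
    """Approximate X(omega) by truncating the series after `bits` fair tosses."""
    return sum(random.getrandbits(1) * 2**-n for n in range(1, bits + 1))

m = 3
trials = 100_000
counts = [0] * 2**m
for _ in range(trials):
    x = sample_X()
    counts[min(int(x * 2**m), 2**m - 1)] += 1

# Each dyadic interval [i/2^m, (i+1)/2^m) should receive roughly 2^-m of the mass.
for i, c in enumerate(counts):
    print(f"P(X in [{i}/{2**m}, {i+1}/{2**m})) ~ {c / trials:.4f}")
```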

Any help on both questions would be greatly appreciated. Thank you.

Mog
  • 279
  • 1
    Every real number has a binary expansion. – Sank Oct 17 '18 at 04:18
  • @user282639 Could you explain how that relates to the problem? – Mog Oct 17 '18 at 04:57
  • Since $P(X\in[i2^{-m},(i+1)2^{-m}])$ does not depend on $i$ it can help to show that $X$ has a uniform distribution. – kludg Oct 17 '18 at 06:13
  • @kludg I'm not sure how to compute the probabilities; what would the sketch of the proof (even if in general) look like to prove a r.v. is uniform? Also, I've made an edit to the post updating my progress. – Mog Oct 17 '18 at 06:18
  • @pilotmath are you sure that the coin is not necessarily fair? – kludg Oct 17 '18 at 06:26
  • @kludg the coin is fair by the construction of the problem, is it not? – Mog Oct 17 '18 at 06:31
  • You never mention that the coin-tossing probability is $1/2$; is it? – kludg Oct 17 '18 at 06:34
  • @kludg it is a fair coin, I will edit accordingly. – Mog Oct 17 '18 at 06:37

4 Answers

3

The point is to use the connection with binary expansions.

Fix $n \in \mathbb{N}$ and consider the sum $A := \sum\limits_{k=1}^n \frac{1}{2^k} X_k(\omega) = \frac{1}{2^n} \sum\limits_{k=1}^n 2^{n-k} X_k(\omega)$. Observe that the last sum is the binary representation of some integer $0\leq i \leq 2^n-1$, whatever the outcomes $X_k(\omega)$ happen to be. Due to the independence of $\{X_n\}$, for each fixed integer $0\leq i \leq 2^n - 1$ we get $\mathbb{P}\big( A = \frac{i}{2^n}\big) = 2^{-n}$, since all bits $X_1(\omega),\ldots,X_n(\omega)$ must coincide with the bits of the given integer $i$. On the other hand, for the rest of the sum in $X(\omega)$ we have $$ 0\leq \sum\limits_{k = n+1}^\infty \frac{1}{2^k} X_k(\omega) \leq \sum\limits_{k = n+1}^\infty \frac{1}{2^k} = \frac{1}{2^n}. $$ It follows that $\mathbb{P}\big(X(\omega) \in [\frac{i}{2^n}, \frac{i+1}{2^n}]\big) = 2^{-n}$. Since consecutive intervals meet only at single points, and every single point has probability zero (at most two bit sequences map to it), summing over $j = 0,\ldots,i-1$ gives $\mathbb{P}(X \leq \frac{i}{2^n}) = \frac{i}{2^n}$.
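
For concreteness, a small worked case (not spelled out above): with $n=2$, the four equally likely prefixes give $$ (X_1,X_2)\in\{(0,0),(0,1),(1,0),(1,1)\} \;\longmapsto\; A\in\Big\{0,\tfrac14,\tfrac12,\tfrac34\Big\}, $$ so each of the four intervals $[\frac{i}{4},\frac{i+1}{4}]$, $i=0,\ldots,3$, receives probability $\frac14$.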

Now if $F(x)$ is the cdf (distribution function) of $X$, we have $F(x) = 0$ for $x\leq 0$ and $F(x) = 1$ for $x\geq 1$ by trivial estimates, and $F(x) = x$ whenever $x \in [0,1]$ is a dyadic rational of the form $\frac{i}{2^n}$ for some $n\in \mathbb{N}$ and $0\leq i \leq 2^n-1$. Finally, using the right continuity of $F$ and the density of the dyadic rationals in $[0,1]$, we conclude that $F(x) = x$ for all $x\in [0,1]$, and hence $X$ has the uniform distribution on $[0,1]$.
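
To spell out the last step (my own elaboration, using only facts already established): for an arbitrary $x\in[0,1)$ pick dyadic rationals $d_k \downarrow x$; then right continuity gives $$ F(x)=\lim_{k\to\infty}F(d_k)=\lim_{k\to\infty}d_k=x. $$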

Hayk
  • 3,775
  • 1
  • 12
  • 25
  • thank you for your answer. Do you have any comment on how I can go about showing that $X$ is a random variable before proving its uniform distribution? – Mog Oct 17 '18 at 07:23
  • $X$ is a limit of random variables (the partial sums of the series representing it), hence is a random variable itself, as taking limits (along with addition, multiplication, and other elementary operations) on random variables preserves measurability. – Hayk Oct 17 '18 at 07:41
  • I was under the impression that only $\liminf X_n$ and $\limsup X_n$ are random variables; at least the book I'm using has a theorem for that only. Is that applicable here? – Mog Oct 17 '18 at 07:50
  • If the limit exists, which is the case with $X$, then both $\liminf$ and $\limsup$ coincide with the limit, so you can refer to either of the results you mentioned for measurability of $X$. – Hayk Oct 17 '18 at 07:59
  • thank you for that, does the limit exist simply because we defined $X$ to be the limit of the partial sums, and does having $2^{-n}$ multiplied to each random variable change anything? Also, is my approach as mentioned in the original post a dead-end, or could I possibly take that proof somewhere? – Mog Oct 17 '18 at 08:03
  • When we have an infinite series, it is effectively a notation for the limit of the sequence of partial sums. In order not to say each time that we have a limit of some sequence, we denote that by an infinite sum symbol. The factors $1/2^n$ are crucial for convergence; they open the door to many nice convergence tests for the series, such as the domination criterion you use above. As for your approach, the 1st part tries to prove measurability of $X$ directly from the definition, which wouldn't be easy and is not necessary, since that is standard. The 2nd is what we used in the answer above. – Hayk Oct 17 '18 at 08:42
  • that makes sense. Lastly, does having factors $2^{-n}$ on $X_n$ affect measurability of $X_n$ at all? Clearly each $X_n$ is measurable, but I'm not sure if each $X_n2^{-n}$ is measurable? (if it is, then I can extend that to $X$ being measurable as we've discussed above) – Mog Oct 17 '18 at 09:22
  • multiplying by a constant factor does not affect the measurability. Moreover, the product of two measurable functions (random variables) is always measurable. – Hayk Oct 17 '18 at 09:25
  • For $X$ taking values in $[0,1]$, is it enough to mention that $X$ is in fact the formula for finding the binary expansion of any number in $[0,1]$, or would the simpler argument of $0\leq X_n2^{-n}\leq 2^{-n}$ and taking the limit of the partial sums which would bound $X$ s.t. $0\leq X\leq 1$ suffice? – Mog Oct 17 '18 at 09:34
  • the bound $0\leq X_n \leq 1$ is what you need to show $0\leq X \leq 1$. – Hayk Oct 17 '18 at 09:41
  • thank you for your help! – Mog Oct 17 '18 at 13:50
  • btw, I noticed a typo, $2^{n-1}$ should be $2^n-1$ in the 3rd and the 4th line, right? Just wanted to make sure as some people may visit this page in the future – Mog Oct 17 '18 at 17:10
  • you're right, it must be $2^n - 1$, thanks for pointing out, I'll edit the post – Hayk Oct 17 '18 at 17:36
2

The intuition behind the proof is as follows. Let us start with $P(X\in[0,2^{-1}])$; the event of interest can occur in two ways:

  • $X_1=0$;
  • $X_1=1$, and all $X_i=0$, $i>1$.

The second way has zero probability, so we ignore it: $$P(X\in[0,2^{-1}])=P(X_1=0)=2^{-1}$$

Arguing this way, we prove that $$P(X\in[i2^{-m},(i+1)2^{-m}])=2^{-m},$$ because (up to events of probability zero) the event of interest corresponds to a unique combination of the values of the random variables $X_1,X_2,\ldots,X_m$. For example, $$P(X\in[2\cdot 2^{-2}, 3\cdot 2^{-2}])=P(X_1=1,X_2=0)=2^{-2}$$ (See the sketch below for how the interval index determines the bits.)
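
A small sketch of the bookkeeping (my addition, using the half-open convention $[i2^{-m},(i+1)2^{-m})$, which differs from the closed intervals above only on null events): the $m$ binary digits of $i$ are exactly the required values of $X_1,\ldots,X_m$.

```python
def prefix_for_interval(i, m):
    """Return the values X_1, ..., X_m forcing X into [i/2^m, (i+1)/2^m)."""
    return [(i >> (m - k)) & 1 for k in range(1, m + 1)]

# The interval [2/4, 3/4) corresponds to the prefix (X_1, X_2) = (1, 0).
print(prefix_for_interval(2, 2))  # [1, 0]
```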

kludg
  • 2,667
  • thank you for the clear answer. Do you have any idea how I could move past the impasse(seemingly) on proving the measurability/that $X$ is a random variable? – Mog Oct 17 '18 at 07:14
1

[Histogram: 10,000 summations of $n=12$ coin flips each]

My contribution: a histogram of 10,000 runs of 12 flips each.
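
The original plotting code was not posted; here is a minimal sketch that could reproduce such a histogram, assuming numpy and matplotlib:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n_bits, n_runs = 12, 10_000

# Each row is one run of 12 fair coin flips; weight bit k by 2^-k and sum.
flips = rng.integers(0, 2, size=(n_runs, n_bits))
weights = 2.0 ** -np.arange(1, n_bits + 1)
samples = flips @ weights

plt.hist(samples, bins=64, density=True)
plt.title("10,000 summations of n=12 coin flips each")
plt.xlabel("X (12-bit truncation)")
plt.ylabel("density")
plt.show()
```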

phdmba7of12
  • 1,068
0

We can use characteristic functions (and Lévy's continuity theorem). Notice that if $U\sim \text{Unif}[0,1]$: $$\phi_U(t)=\int_0^1 e^{ixt}\,dx=\begin{cases}\frac{e^{it}-1}{it} & t\not=0\\ 1&t=0\end{cases}$$

For $t\not=0$, one has that: $$\phi_{\sum_{k=1}^n \frac{X_k}{2^k}}(t)=\prod_{k=1}^n\mathbb{E}e^{i \frac{X_k}{2^k} t}=\frac{1}{2^n}\prod_{k=1}^n (e^{i \frac{t}{2^k}}+1)=\frac{1}{2^n}\prod_{k=1}^n\frac{e^{i\frac{t}{2^{k-1}}}-1}{e^{i \frac{t}{2^k}}-1}=\frac{e^{it}-1}{2^n (e^{i\frac{t}{2^n}}-1)}$$

Notice that the denominator may be rewritten as: $$ t\left(\frac{\cos(t/2^n)-1}{t/2^n}+i\,\frac{\sin(t/2^n)}{t/2^n}\right)\xrightarrow{n\rightarrow \infty} it $$ When $t=0$ the convergence is even easier, as $\phi_{\sum_{k=1}^n X_k/2^k}(0)=1=\phi_U(0)$. Therefore, if $Y_n:= \sum_{k=1}^n X_k/2^k$, we have proved that:

$$Y_n\Rightarrow U\quad \text{and}\quad Y_n\rightarrow X \text{ pointwise}$$

It is a general fact that pointwise convergence together with convergence in distribution implies $\mathbb{P}_X=\mathbb{P}_U$. (Here $Y_n\Rightarrow U$ follows from the pointwise convergence $\phi_{Y_n}\to\phi_U$ by Lévy's continuity theorem.)

Let us prove this. If $g$ is bounded and continuous, we have that:

$$\mathbb{E}g(X)=\mathbb{E}g(\lim Y_n)=\mathbb{E}\lim g(Y_n)=\lim \mathbb{E}g(Y_n)=\mathbb{E}g(U)$$

where the equalities are due, respectively, to the pointwise convergence $Y_n\to X$, the continuity of $g$, the dominated convergence theorem (applicable since $\|g\|_{\infty}<\infty$), and finally the convergence in distribution $Y_n\Rightarrow U$.

That $\mathbb{E}g(X)=\mathbb{E}g(U)$ for every bounded continuous function $g$ is enough to conclude that $X$ and $U$ have the same distribution, by the standard fact that expectations of bounded continuous functions determine the distribution.
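
As a numerical sanity check (my sketch, assuming numpy), one can verify that the direct product of the factors $\frac12(e^{it/2^k}+1)$, its telescoped closed form, and $\phi_U(t)$ agree for large $n$:

```python
import numpy as np

def phi_Yn_product(t, n):
    """Characteristic function of Y_n computed directly as a product."""
    k = np.arange(1, n + 1)
    return np.prod((np.exp(1j * t / 2**k) + 1) / 2)

def phi_Yn_closed(t, n):
    """The telescoped closed form of the same characteristic function."""
    return (np.exp(1j * t) - 1) / (2**n * (np.exp(1j * t / 2**n) - 1))

t, n = 3.7, 30
print(phi_Yn_product(t, n))             # matches the closed form
print(phi_Yn_closed(t, n))              # close to phi_U(t) for large n
print((np.exp(1j * t) - 1) / (1j * t))  # phi_U(t) for t != 0
```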

Kadmos
  • 3,243