14

Problem: Let $\mathbf{X} = (\mathbb{Z}_2)^\mathbb N$, i.e., $\mathbf{X} = (X_1,X_2,\cdots,X_N,\cdots)$, $X_i\in \{0,1\} $. It can be viewed as a countable collection of lightbulbs: $0$ means off, $1$ means on. We start with $\mathbf{X}_0 = 0$ and keep generating independent geometric random variables $K_1, K_2,\cdots$, each with distribution $\mathrm{Geom}(1/2)$, i.e., $\mathbb P(K=k)=2^{-k}$. Now let $\mathbf{X}_m$ (for $m \ge 1$) be given by $$(\mathbf{X}_m-\mathbf{X}_{m-1})_k = \mathbf{1}(k = K_m), $$ i.e., in the $m$-th turn we only toggle the status of the $K_m$-th lightbulb. What, then, is the probability that all lights are off again at some point, i.e., $$\mathbb P(\exists m>1,\ \mathbf{X}_m =0)?$$
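The process is easy to experiment with; below is a quick Monte Carlo sketch (function and variable names are ad hoc). Since it only runs for a fixed step budget, it estimates a lower bound on the probability in question:

```python
import random

def simulate_return(max_steps=2000, rng=random):
    """One realization of the process; True if all lights are off
    again at some step m <= max_steps."""
    on = set()                     # indices of bulbs currently on
    for _ in range(max_steps):
        k = 1
        while rng.random() < 0.5:  # K ~ Geom(1/2): P(K = k) = 2^{-k}
            k += 1
        on ^= {k}                  # toggle bulb k
        if not on:
            return True
    return False

random.seed(0)
trials = 2000
hits = sum(simulate_return() for _ in range(trials))
print(hits / trials)  # a lower bound on the answer; convergence is slow
```

Consistent with the comments below, the empirical frequency within a couple thousand steps comes out around $0.8$, even though (as the answers show) the true probability is $1$.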

My first intuition below was wrong, I'm afraid.

My intuition was that since $1/4+1/16+1/64+\cdots=1/3$, the final answer might be $1/2$. Since all lights being off means every light has been toggled an even number of times, I tried to use generating functions, but this didn't work: it is hard to set up a correspondence between "every light is toggled an even number of times" and the coefficients of a generating function. Brute-force calculation also seems hard for the same reason: there are too many cases to consider, and I'm stuck here. Are there any other thoughts on how to deal with this question? Thanks!

New intuition: We may define a function $f(x_1,x_2,\cdots)$ as the probability that the state $\mathbf{X}=(x_1,x_2,\cdots)$ will eventually become all $0$. Our goal is then to calculate $\sum_{n=1}^{\infty}\frac{1}{2^n}f(0,\cdots,0,1,0,\cdots)$, where the $1$ appears only in the $n$-th coordinate. We can write down all the transition equations for $f$, which form an uncountably infinite system of equations. We see immediately that this system has the solution $f(x_1,x_2,\cdots)\equiv 1$. Can we conclude that this solution is unique (in which case the answer to this problem is $1$)?
Continuing the intuition above, we define $$g(x_1,x_2,\cdots)=f(x_1,x_2,\cdots)-1,$$ and moreover we set $g(0,0,\cdots,0,\cdots)=0$.
Obviously $f\le1$, hence $g\le 0$ attains its maximum at $(0,0,0,\cdots,0,\cdots)$. Note that after rewriting all the equations for $f$ in terms of $g$, the constant terms cancel, and on the right-hand side every coefficient is strictly positive and the coefficients always sum to $1$. For example, the equation $$f(1,0,0,\cdots,0,\cdots)=\frac{1}{2}+\frac{1}{4}f(1,1,0,\cdots,0,\cdots)+\frac{1}{8}f(1,0,1,\cdots,0,\cdots)+\cdots$$ is changed into $$g(1,0,0,\cdots,0,\cdots)=\frac{1}{2}g(0,0,0,\cdots,0,\cdots)+\frac{1}{4}g(1,1,0,\cdots,0,\cdots)+\frac{1}{8}g(1,0,1,\cdots,0,\cdots)+\cdots$$ We want to use the maximum principle, but we lack an equation with $\text{LHS}=g(0,0,0,\cdots,0,\cdots)$; is there any way to supply this equation?

Edit: Another possible intuition from here: ignore $g(0,0,0,\cdots,0,\cdots)$ and focus on the remaining terms. We wish to discard the states with infinitely many $1$s, since the probability of reaching such a state is $0$. If the maximum is attained at one of the remaining elements we're done; what we don't want is the maximum being attained only in the limit, and we wish to prove that this can't happen because the limit is (in some sense) decaying. (I don't know how to proceed here, but I'll keep posting my intuitions these days anyway.)

Ho-Oh
  • 956
  • 1
    Based on Python simulations, it looks like the relative frequency of the game never ending is less than $20\%$. Equivalently, the game ends more than $80\%$ of the time. [I don't rule out the possibility that the game ends with probability $1$.] – paw88789 Apr 19 '22 at 15:05
  • Hmm, maybe the answer is $1$; I have the intuition that the number of lights on is "in some sense diminishing", but I don't know how to express it in a mathematical manner – Ho-Oh Apr 19 '22 at 15:10
  • I don't know if it can be used in any way (I wasn't able to) but by recursion the probability that the $k$-th bulb is off at the $m$-th step is $\frac{1}{2} + \frac{1}{2} \left(1 - \left(\frac{1}{2} \right)^k \right)^m$ – Federico Apr 19 '22 at 17:50
  • @Federico Since all these probabilities tend to $1/2$ as $m$ tends to infinity, I suppose that this conclusion is more likely to prove the 'opposite' result – Ho-Oh Apr 20 '22 at 01:18
  • @Ho-Oh I'm not entirely sure about that: there is a double limit involved ($m \to \infty$ and $k \to \infty$) and to draw your conclusion you are first taking the limit for $m \to \infty$ and then for $k \to \infty$, whereas to me it seems we should take them in the reverse order. It is not clear to me if we can exchange the order in this case. – Federico Apr 20 '22 at 17:35
  • 2
    This can be regarded as a (non-uniform) random walk over an infinite-dimensional hypercube – leonbloy Apr 21 '22 at 14:28
  • @leonbloy This is my intuition, and I think this can be the way to the solution. Do you know any results about random walks over infinite-dimensional hypercubes that might be helpful for this problem? – Ho-Oh Apr 21 '22 at 14:31
  • @Federico But that claim holds for a fixed $k$ as $m \to \infty$, right? That seems potentially interesting, but I'm not sure yet if it's important. – Aaron Montgomery Apr 21 '22 at 16:03
  • @AaronMontgomery Yes, it holds for a fixed $k$ as $m \to \infty$. However, I have to make a correction, the probability of the $k$-th bulb being off at the $m$-th step is $\frac{1}{2} + \frac{1}{2} \left( 1 - \left(\frac{1}{2}\right)^{k-1}\right)^{m}$ (the exponent is $k-1$, not $k$). – Federico Apr 23 '22 at 13:17

5 Answers

9

The question has already accumulated more than 10 upvotes, so it is probably time to post a solution. Leonbloy essentially said everything there was to say: it is a Markov chain, so showing that the probability in question is $1$ is equivalent to showing that the chain is recurrent. The standard recurrence criterion is $\sum_m P_m=+\infty$, where $P_m$ is the probability that all the lights are off after $m$ steps. The only issue is finding (or estimating) $P_m$.

By a direct computation, $$ P_m=\sum_{(k_1,\dots,k_m)\in Q_m} 2^{-(k_1+\dots+k_m)} $$ where $Q_m$ is the set of $m$-tuples in which every number occurs an even number of times. A direct combinatorial evaluation of $P_m$ seems not easy (unless I'm missing some trick), but it is clear that the same quantity can be expressed as $$ E\left[\left(\sum_{k\ge 1} 2^{-k}Y_k\right)^{m}\right] $$ where the $Y_k$ are independent random variables taking the values $\pm 1$ with probability $1/2$ each (just open the parentheses and distribute the expectation: a term survives exactly when every index occurs an even number of times). For even $m$, this expectation is at least $$ P\{Y_1=Y_2=\dots=Y_\ell=1\}(1-2^{-(\ell-1)})^m=2^{-\ell}(1-2^{-(\ell-1)})^m $$ for every $\ell>0$. Optimizing over $\ell$, we get a lower bound for $P_m$ comparable to $1/m$ for even $m$, so the series $\sum_m P_m$ diverges and we are done.
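As a numerical sanity check (not part of the argument), one can approximate the expectation above by truncating the series at $K$ terms and enumerating all $2^K$ sign patterns; since $|S|\le 1$, truncation changes $E[S^m]$ by at most $m\,2^{-K}$. The values agree with the closed form $P_m = 1/(m+1)$ obtained in other answers on this page:

```python
from itertools import product

K = 16  # truncation level; perturbs E[S^m] by at most m * 2**-K

# all 2^K values of S = sum_{k=1}^{K} 2^{-k} y_k over sign patterns y
sums = [sum(y * 2.0**-(k + 1) for k, y in enumerate(signs))
        for signs in product((1.0, -1.0), repeat=K)]

def P(m):
    """Approximation of P_m = E[S^m] by exact enumeration (up to truncation)."""
    return sum(s**m for s in sums) / len(sums)

for m in (2, 4, 6, 8, 10):
    print(m, P(m), 1 / (m + 1))  # the two columns agree to within the truncation error
```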

fedja
  • 19,348
7

I believe that with probability $1$ the lights will all be off again at some point. The argument:


For $n$ even, let $p_n$ be the probability that all the light bulbs are off after $n$ steps. The main idea of our argument will be to show that

Claim: There is a constant $C>0$ such that $$p_n \geq \frac{C}{n}$$

Why is the Claim Enough? If the claim is true, then the expected number of times at which all the bulbs are off is infinite (The Harmonic Series Diverges), which implies that the probability all the bulbs are off at some point is $1$. (This is a standard idea for random walks -- if the probability of return was $p<1$, then the expected number of returns would be $p+p^2+p^3+\dots$, which would be finite).

A reformulation of the Problem: Consider sets $A_1, A_2, \dots$ generated as follows: $A_1 = \{1, \dots, n\}$, and $A_{k+1}$ is generated from $A_k$ by independently keeping each element with probability $\frac{1}{2}$.

We can think of $A_i$ here as corresponding to all the times at which we change a bulb located in position $i$ or later. The bulbs will all be off exactly when all of the $A_i$ are of even cardinality. So we're trying to bound the probability we keep on getting sets of even cardinality until we reach the empty set (at which point all subsequent sets are automatically even cardinality)

The Intuition: Why $\frac{C}{n}$? If we're given $A_k$ (and $A_k$ is non-empty), then $A_{k+1}$ is a uniformly random subset of $A_k$. Exactly half of the subsets of a non-empty set have even cardinality, so we "survive" (our next set has even cardinality) with probability $\frac{1}{2}$. On the other hand, our set shrinks by about half each time we take a subset, so it'll take roughly $\log_2 n$ rounds before we're down to the empty set. The chance we survive all those rounds should be roughly $2^{-\log_2 n} = \frac{1}{n}$.

Making this All Precise: Our intuition above had two parts: We survive with probability $\frac{1}{2}$ with each new subset, and we're done after roughly $\log_2 n$ rounds. The goal of the following lemma is to decouple those two observations in a certain sense.

Lemma: Suppose $A_1$ is given and nonempty, and $t$ is a positive integer. Then $$P(A_2 \textrm{ has even size} \mid A_t = \emptyset) \geq \frac{1}{2}$$

Proof: We have \begin{eqnarray*} P\left(|A_2| \, \textrm{even} \wedge A_t = \emptyset\right)-P\left(|A_2| \, \textrm{odd} \wedge A_t = \emptyset\right) &=& \sum_{k=0}^n (-1)^k P\left(|A_2|=k\right) P\left(A_t = \emptyset \biggl| |A_2|=k\right) \\ &=& 2^{-n} \sum_{k=0}^n (-1)^k \binom{n}{k} P\left(A_t = \emptyset \biggl| |A_2|=k\right) \\ &=& 2^{-n} \sum_{k=0}^n (-1)^k \binom{n}{k} \left(1-2^{-(t-2)}\right)^k \end{eqnarray*} (each element of $A_2$ survives each of the $t-2$ subsequent rounds with probability $\frac{1}{2}$, so it dies by $A_t$ with probability $1-2^{-(t-2)}$, and $A_t$ is empty when all of the elements of $A_2$ die)

By the Binomial Theorem, this simplifies down to $$2^{-n} \left( 1 - (1-2^{-(t-2)}) \right)^n = 2^{-n}\, 2^{-(t-2)n} > 0$$ So, conditioned on $A_t = \emptyset$, even size is at least as likely as odd.

Given this Lemma, we now prove the main claim. Let $m=\lfloor \log_2 n \rfloor +3$. Let $B_1$ be the event that $A_m$ is empty, and $B_2$ be the event that all of $A_1, A_2, \dots, A_m$ are even in size. If $B_2$ and $B_1$ both hold, then all the lights are off at time $n$.

By repeated applications of the lemma, we get that $$P(B_2 | B_1) \geq 2^{-m}$$ It follows that $$P(B_2 \wedge B_1) \geq 2^{-m} P(B_1)$$

We can bound the probability that $A_m$ is non-empty by $$P(B_1^C) \leq n P(1 \in A_m) = n\, 2^{-(m-1)} \leq n\, 2^{-(\log_2 n +1)} = \frac{1}{2}$$ (each element survives the $m-1$ rounds from $A_1$ to $A_m$ with probability $2^{-(m-1)}$, and $m-1 = \lfloor \log_2 n \rfloor + 2 \geq \log_2 n + 1$). It follows that $P(B_1) \geq \frac{1}{2}$, and $$p_n \geq P(B_2 \wedge B_1) \geq \frac{1}{2}\, 2^{-m} \geq \frac{1}{2}\, 2^{-(\log_2 n +3)} = \frac{1}{16n}$$
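As a cross-check of the claim, note that the size of $A_k$ is itself a Markov chain: given the current size $s$, the next size is $\mathrm{Binomial}(s, 1/2)$. So $p_n$ can be computed exactly by a recursion on sizes; a sketch in exact rational arithmetic (the values come out as exactly $\frac{1}{n+1}$, matching other answers and comfortably of order $\frac{C}{n}$):

```python
from fractions import Fraction
from math import comb

def all_even_probs(n_max):
    """q[s]: probability that every later set in the halving chain has
    even size until the empty set is reached, starting from size s."""
    q = [Fraction(0)] * (n_max + 1)
    q[0] = Fraction(1)
    for s in range(1, n_max + 1):
        # sum over even next-sizes k < s (k occurs with prob C(s,k) 2^{-s})
        acc = sum((Fraction(comb(s, k), 2**s) * q[k] for k in range(0, s, 2)),
                  start=Fraction(0))
        # for even s, the k = s term (keep everything, prob 2^{-s})
        # is moved to the left-hand side
        q[s] = acc / (1 - Fraction(1, 2**s)) if s % 2 == 0 else acc
    return q

q = all_even_probs(20)
for n in (2, 4, 10, 20):
    print(n, q[n])  # p_n = 1/(n+1)
```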

  • 1
    This is, honestly, one of the best proofs I've seen, and I've seen many here! It's good to work for 6 hours on a problem without getting it: it's even better when one gets to read a solution after that. I expected harmonic-like decay, but I couldn't quite pull off the mathematics. Good to see that three other people here could pull it off. – Sarvesh Ravichandran Iyer Apr 24 '22 at 04:36
4

Let $u_k$ be the probability that all lights are off after $2k$ steps.

We decompose that event according to the number of flips of the first bulb, using the law of total probability and the fact that a geometric variable conditioned to be at least $2$ is again a (shifted) geometric. Letting $2(k-j)$ be the number of flips of the first bulb, which leaves a length-$2j$ copy of the problem on the remaining bulbs, we get the following recursion:

$$ u_k =\frac{1}{2^{2k}} \sum_{j=0}^k \binom{2k}{2j} u_{j} \iff u_k=\frac{1}{4^k-1} \sum_{j=0}^{k-1} \binom{2k}{2j} u_{j}\tag{1}$$

with the initial condition $u_0=1$. The solution [*] can be obtained explicitly, and is surprisingly simple:

$$u_{k}=\frac{1}{2k+1} \tag 2$$

Then, the sum $\sum_{k=0}^\infty u_k$ diverges; and by the properties of Markov chains, the state is recurrent, and the probability of returning to zero is $1$.

If we wish to compute the probability of returning within finitely many steps: letting $f_k$ be the probability that the first "return to zero" (first time all lights are off) occurs after $2k$ steps (with $f_0=0$), we have the known renewal relationship $$ u_k = f_1 u_{k-1} + f_2 u_{k-2} +\cdots +f_k u_0 \tag 3$$

I've not found an explicit formula for $f_k$. The first values are $$ \begin{array}{c|c|c} k & u_k & f_k\\ \hline 1 & 0.333333 & 0.333333 \\ 2 & 0.200000 & 0.088888 \\ 3 & 0.142857 & 0.046561 \\ 4 & 0.111111 & 0.030194 \\ 5 & 0.090909 & 0.021797 \\ \end{array} $$
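Both the recursion $(1)$ and the renewal equation $(3)$ are easy to run in exact arithmetic; a short sketch that reproduces the closed form $(2)$ and the table above:

```python
from fractions import Fraction
from math import comb

N = 12
u = [Fraction(1)]                      # u_0 = 1
for k in range(1, N + 1):              # recursion (1)
    u.append(sum(comb(2*k, 2*j) * u[j] for j in range(k)) / Fraction(4**k - 1))

f = [Fraction(0)]                      # f_0 = 0; invert the renewal equation (3)
for k in range(1, N + 1):
    f.append(u[k] - sum(f[j] * u[k - j] for j in range(1, k)))

for k in range(1, 6):
    print(k, float(u[k]), float(f[k]))  # u_k = 1/(2k+1); f_k matches the table
```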

The convergence is very slow, though: $\sum_{j=1}^k f_j \approx 1 - \frac{a}{\log k}$ for some constant $a$, and the probability of having returned to zero within $1400$ steps is still below $0.8$.

(Plot: probability of having returned to zero as a function of the number of steps.)


  • Added: the solution of $(1)$:

Let $S(x)= \left( 1+x \right)^{2k}=\sum_{i=0}^{2k} \binom{2k}{i} x^i $ and

$$G(x)= \frac12 \left( S(x) + S(-x) \right) = \sum_{j=0}^{k} \binom{2k}{2j} x^{2j}$$

Then evaluating $\int_0^1 G(x) dx $ we get $$ \sum_{j=0}^{k} \binom{2k}{2j} \frac{1}{2j+1}={2}^{2k} \frac{1}{2 k+1} $$

Hence indeed $(2)$ is a solution of $(1)$, with the right initial condition $u_0=1$.
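The binomial-sum identity above can also be verified directly in exact rational arithmetic for small $k$; a minimal check:

```python
from fractions import Fraction
from math import comb

# check  sum_j C(2k, 2j) / (2j+1)  ==  4^k / (2k+1)  for k = 0..11
for k in range(12):
    lhs = sum(Fraction(comb(2*k, 2*j), 2*j + 1) for j in range(k + 1))
    assert lhs == Fraction(4**k, 2*k + 1), k
print("identity holds for k = 0..11")
```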

leonbloy
  • 66,202
0

First, let's compute the probability $\mathbb{P}(\mathbf{X}_m = 0)$. For parity reasons, this probability is $0$ when $m$ is odd. For $m$ even, it is more interesting.

Notice that $\mathbf{X}_{2m}=0$ amounts to choosing distinct integers $k_{1}, ..., k_{p} \geq 1$, which are the lightbulbs touched during the process; $n_{1}, ..., n_{p} \geq 1$, the number of times each associated lightbulb is switched; and finally the times at which lightbulb $k_{i}$ is chosen. As we want all lightbulbs switched off after $2m$ steps, all $n_{i}$ are even and $\sum_{1 \leq i \leq p} n_{i} = 2m$. So, instead of $n_{i}$, we will write $2n_{i}$.

Combining all these aforementioned observations, the probability we are looking for is given by the formula below: $$ \mathbb{P}(\mathbf{X}_{2m}=0) = \sum_{k_{1},...,k_{p} \geq 1, n_{1},...,n_{p} \geq 1 : \sum_{i=1}^{p}n_{i} = m}\binom{2m}{2n_{1},\cdots, 2n_{p}}4^{-k_{1}n_{1} - \cdots -k_{p}n_{p}} $$ Now, consider the function: $$ f(x) = \prod_{k=1}^{+\infty}\cosh(2^{-k}x) = \prod_{k=1}^{+\infty}\left(\sum_{n=0}^{+\infty} \dfrac{4^{-kn}x^{2n}}{(2n)!}\right) $$ Expanding the product and identifying the coefficients of the power series gives: $$ f(x) = \sum_{m=0}^{\infty} \mathbb{P}(\mathbf{X}_{2m}=0) \dfrac{x^{2m}}{(2m)!} $$ Moreover, the relationship $\sinh(2x)=2\sinh(x)\cosh(x)$ implies that $$ f(x) = \dfrac{\sinh(x)}{x} = \sum_{m=0}^{+\infty}\dfrac{x^{2m}}{(2m+1)!} $$ Hence: $$ \mathbb{P}(\mathbf{X}_{2m} = 0)=\dfrac{1}{2m+1} $$ Now we are finally able to compute the probability $\mathbb{P}(\exists m \geq 1, \mathbf{X}_{m} = 0)$. For this, let $\tau = \sup\{ m \geq 1 : \mathbf{X}_{m}=0\} \in \mathbb{N} \cup \{+\infty\}$. Using the fact that the geometric random variables are i.i.d. (so after a return to $0$ the process starts afresh), we are left with: $$ \begin{aligned} \mathbb{P}(\exists m \geq 1, \mathbf{X}_{m} = 0)&=\sum_{m=1}^{+\infty} \mathbb{P}(\tau = m)+ \mathbb{P}(\tau = +\infty)\\ &\geq \sum_{m=1}^{+\infty} \mathbb{P}(\tau = m)\\ &=\sum_{m = 1}^{+\infty} \mathbb{P}(\mathbf{X}_{m} = 0, \forall k > m, \mathbf{X}_{k} \neq 0)\\ &=\sum_{m = 1}^{+\infty}\frac{1}{2m+1}\mathbb{P}(\forall k > 2m, \mathbf{X}_{k} \neq 0)\\ &=\mathbb{P}(\forall k >0, \mathbf{X}_{k} \neq 0)\sum_{m = 1}^{+\infty}\frac{1}{2m+1}\\ &=\left[1-\mathbb{P}(\exists m > 0, \mathbf{X}_{m}=0)\right]\sum_{m = 1}^{+\infty}\frac{1}{2m+1} \end{aligned} $$ Now, $\sum_{m = 1}^{+\infty}\frac{1}{2m+1} = +\infty$. Hence, as $\mathbb{P}(\exists m \geq 1, \mathbf{X}_{m} = 0)$ is bounded, the only possibility left is $$ \mathbb{P}(\exists m \geq 1, \mathbf{X}_{m} = 0)=1 $$
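The product formula $\prod_{k\ge1}\cosh(2^{-k}x)=\sinh(x)/x$ is easy to confirm numerically by truncating the product (the omitted factors are exponentially close to $1$); a quick check:

```python
import math

def f_prod(x, K=60):
    """Truncation of  prod_{k=1}^{K} cosh(2^{-k} x)."""
    p = 1.0
    for k in range(1, K + 1):
        p *= math.cosh(x / 2**k)
    return p

for x in (0.5, 1.0, 2.0, 5.0):
    print(x, f_prod(x), math.sinh(x) / x)  # the two columns agree
```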

0

As noted in many other answers, it suffices to show an $\Omega(1/t)$ lower bound on the chance that the process has returned to the starting state at time $t$. This answer gives another way of showing such a lower bound, namely by considering a continuous-time version of the process. The advantage of this approach is that in the continuous-time version, each bit evolves independently. This reduces the required combinatorics, albeit at the cost of introducing some continuous-time Markov chain (CTMC) theory.

Let $\{\mathbf{Y}(t)\}_{t \in [0, \infty)}$ be a CTMC on $\{0, 1\}^\infty$ such that, in every state and for each $i$, there is a rate-$2^{-i} \lambda$ transition that flips bit $i$. Put another way, we can define $\mathbf{Y}(t)$ as the CTMC where

  • the total transition rate from each state is $\lambda$, and
  • the state after $m$ transitions is $\mathbf{X}_m$.

As in other answers, it suffices to show that $\mathbb{P}[\mathbf{Y}(t) = \mathbf{0}] \geq \Omega(1/t)$. (Specifically, this implies $\mathbf{Y}(t)$ spends infinite time in expectation in state $\mathbf{0}$. Because each visit has length distributed as $\mathrm{Exponential}(\lambda)$, this implies infinitely many visits, so both $\mathbf{Y}(t)$ and $\mathbf{X}_m$ are recurrent.)

The key observation is that in the continuous-time process, each bit $Y_i(t)$ evolves independently, flipping at rate $2^{-i} \lambda$ regardless of the states of the other bits. This means $$ \mathbb{P}[\mathbf{Y}(t) = \mathbf{0}] = \prod_{i = 1}^\infty \mathbb{P}[Y_i(t) = 0] $$ At this point, the rest of the solution is similar to other solutions: we lower bound $\mathbb{P}[Y_i(t) = 0]$, then bound the resulting infinite product, getting an $\Omega(1/t)$ bound at the end of the day.

It is not too hard to compute $\mathbb{P}[Y_i(t) = 0]$ exactly, but for our purposes, two simple lower bounds suffice: $$\begin{aligned} \mathbb{P}[Y_i(t) = 0] &= \frac{1}{2} + \frac{1}{2} \exp(-2^{-(i - 1)} \lambda t) \\ &\geq \max\biggl\{\frac{1}{2}, \exp(-2^{-i} \lambda t) \biggr\}. & &(*) \end{aligned}$$ This means, letting $\ell = \lceil \log_2 (\lambda t) \rceil$, $$\begin{aligned} \mathbb{P}[\mathbf{Y}(t) = \mathbf{0}] &\geq 2^{-\ell} \prod_{i = \ell + 1}^\infty \exp(-2^{-i} \lambda t) \\ &\geq \frac{1}{2 \lambda t} \prod_{i = \ell + 1}^\infty \exp(-2^{-(i - \ell)}) \\ &= \frac{1}{2 \lambda t} \exp\biggl(- \sum_{i = 1}^\infty 2^{-i}\biggr) \\ &= \frac{1}{2 e \lambda t}, \end{aligned}$$ so $\mathbb{P}[\mathbf{Y}(t) = \mathbf{0}] \geq \Omega(1/t)$, as desired.

For completeness, here are ways to see the lower bounds in $(*)$ directly:

  • We have $Y_i(t) = 0$ if (but not only if) bit $i$ has not transitioned by time $t$, which happens with probability $\exp(-2^{-i} \lambda t)$.
  • The stationary distribution of bit $i$ is clearly $\bigl[\frac{1}{2}, \frac{1}{2}\bigr]$. A straightforward coupling argument shows that starting with bit $i$ set to $0$ (as opposed to starting with bit $i$ distributed according to the stationary distribution) only increases the chance of bit $i$ being $0$ at any future time.
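A quick numerical check of the final bound, with the rate scale $\lambda$ set to $1$ for concreteness: compute the product for $\mathbb{P}[\mathbf{Y}(t)=\mathbf{0}]$ directly (truncating it, which only overestimates slightly) and compare against $\frac{1}{2e\lambda t}$:

```python
import math

lam = 1.0  # rate scale λ; set to 1 for concreteness

def p_zero(t, K=80):
    """P[Y(t) = 0]: product over bits i of 1/2 + 1/2 * exp(-2^{-(i-1)} λ t)."""
    p = 1.0
    for i in range(1, K + 1):
        p *= 0.5 + 0.5 * math.exp(-2.0**-(i - 1) * lam * t)
    return p

for t in (10.0, 100.0, 1000.0):
    print(t, p_zero(t), 1 / (2 * math.e * lam * t))  # first column stays above the bound
```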
Ziv
  • 1,279