The following nice riddle is a quote from the excellent, free-to-download book: Information Theory, Inference, and Learning Algorithms, written by David J.C. MacKay.
How can you use a (fair) coin to draw straws among 3 people?
To expand on Tobias' comment, designate that Player 1 wins on TT (Tails followed by Tails), Player 2 wins on TH, and Player 3 wins on HT. On HH, simply discard the result and flip again. Since all outcomes are equally likely and each player has a single winning outcome, the winner will be chosen uniformly at random.
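A minimal Python sketch of this scheme (the function name and structure are mine, not from the book): two flips decide the winner, and the HH outcome is rejected and redrawn.

```python
import random

def draw_among_three(flip=lambda: random.randint(0, 1)):
    """Return the winner (1, 2, or 3) using only fair coin flips.

    Encoding: 0 = tails, 1 = heads.  TT -> player 1, TH -> player 2,
    HT -> player 3, HH -> discard the pair and flip again.
    """
    while True:
        first, second = flip(), flip()
        if (first, second) == (0, 0):   # TT
            return 1
        if (first, second) == (0, 1):   # TH
            return 2
        if (first, second) == (1, 0):   # HT
            return 3
        # HH: rejected, loop and flip two more times

counts = {1: 0, 2: 0, 3: 0}
for _ in range(30000):
    counts[draw_among_three()] += 1
print(counts)  # each count should be near 10000
```

Since the rejected outcome HH has probability $1/4$ per round, the expected number of flips is $2 \cdot 4/3 = 8/3$.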
Let $p_1, p_2, \ldots, p_n$ be probabilities adding up to $1$. Each $p_k$ represents the desired probability for the $k$th straw to be drawn.
Let $q_0 = 0$ and $q_k = p_1 + p_2 + \cdots + p_k$ for $k = 1, \ldots, n$, so that $q_n = 1$.
Consider the intervals $I_1 = (0, q_1), I_2=(q_1, q_2), \ldots, I_n=(q_{n-1},q_n)$ of the real line. We are going to produce a uniformly random real number between $0$ and $1$ and see which interval it lies in. The probability that it lies in the $k$th interval will then be $p_k$.
We will construct our number by flipping a coin repeatedly to form the bits of a binary fraction. In practice, we will stop as soon as we can determine which interval it lies in, but for simplicity we will deal with the entire sequence at once.
Let $r_1,r_2,\ldots$ be the results of the coin flips, where 1 represents heads and 0 represents tails.
Let $r = 0.r_1r_2\ldots$, interpreted as a binary fraction.
We will assume without proof that this procedure produces a number uniformly distributed in $[0,1]$.
We will consider the bits of $r$ one at a time (as the coin is flipped).
Looking at the first $j$ bits of $r$ tells us that $r$ lies in a certain interval $K_j$ of length $2^{-j}$: each additional bit halves the length of the interval. For example, if the first bit is $1$, then we know that $r \in [0.1, 1]$ (in binary, i.e. $r \ge 1/2$). If the first bit is $1$ and the second is $0$, then we know that $r \in [0.1, 0.11]$.
Note that once $2^{-j} < \min \bigl\{p_k: 1 \le k \le n\bigr\}$, or, equivalently, $j > \log_2 \bigl(1 \big/ \min \{p_k: 1 \le k \le n\}\bigr)$, $K_j$ will intersect at most two intervals $I_m, I_{m+1}$.
At this point, the only remaining question is whether $r < q_m$, $r > q_m$, or $r = q_m$. Since the case of $r = q_m$ has probability $0$, we will assume that it does not occur.
Then by the time $2^{-j} < \big|r - q_m\big|$, $K_j$ will lie entirely to the left or entirely to the right of $q_m$, and $r$ will have been determined to lie in $I_m$ or $I_{m+1}$.
The closer $r$ is to $q_m$, the longer the procedure will take, but as long as they are not actually equal, it will eventually terminate.
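Here is one way the procedure above might be sketched in Python (the function name and the use of floating point for the window are my choices, not part of the original argument): maintain the window $K_j = [\mathrm{lo}, \mathrm{lo} + 2^{-j}]$, halve it with each flip, and stop as soon as it fits inside a single interval $I_k$.

```python
import random
from itertools import accumulate

def draw_straw(probs, flip=lambda: random.randint(0, 1)):
    """Pick index k with probability probs[k], using only fair coin flips.

    Maintains the window [lo, lo + width] known to contain the random
    binary fraction r, halving it with each flip, and stops once the
    window fits inside a single interval (q_{k-1}, q_k).
    """
    q = list(accumulate(probs))          # partial sums q_1, ..., q_n
    lo, width = 0.0, 1.0
    while True:
        width /= 2
        if flip():                       # bit 1: r lies in the upper half
            lo += width
        hi = lo + width
        for k, right in enumerate(q):
            left = q[k - 1] if k > 0 else 0.0
            if left <= lo and hi <= right:
                return k                 # window fits inside interval k
```

For example, `draw_straw([0.5, 0.25, 0.25])` returns $0$, $1$, or $2$ with those probabilities. As argued above, the loop terminates with probability $1$: it can only run forever if $r$ equals one of the $q_k$ exactly.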
A somewhat more practical description:
Represent $q_0, q_1 \ldots, q_n$ in binary:
$q_0 = 0.000\ldots$, $q_1 = 0.b_{11}b_{12}\ldots$, $q_2 = 0.b_{21}b_{22}\ldots$, and so on up to $q_n$.
Flip a fair coin. If the coin comes up $0$ (tails), then our number is in the interval $(0, 0.1)$ (binary), so we discard from consideration any interval whose left endpoint has a first bit of $1$. That is, if $b_{k-1,1} = 1$, then we discard $I_k$. If the coin comes up $1$ (heads), we discard from consideration any interval whose right endpoint has a first bit of $0$. That is, if $b_{k,1} = 0$, we discard $I_k$.
Flip the coin again; the window containing $r$ now has length $0.01$ (binary), and we again discard every interval that no longer overlaps it. For an endpoint that agrees with $r$ on all bits seen so far, this amounts to comparing its next bit with the new flip.
Keep flipping until only one interval is left.
Note: there is probably a more efficient/effective way to describe this procedure, but I don't remember where I read it.
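The elimination procedure above can be sketched as follows (again a sketch in my own notation, with a direct overlap test standing in for the bit-by-bit comparison of endpoints): keep a set of candidate intervals and drop each one as soon as it no longer overlaps the window known to contain $r$.

```python
import random

def draw_straw_by_elimination(q, flip=lambda: random.randint(0, 1)):
    """q = [q_0, q_1, ..., q_n], the interval endpoints (q_0 = 0, q_n = 1).

    After j flips, r is known to lie in the window [lo, lo + 2**-j].
    Discard interval I_k = (q[k-1], q[k]) once it no longer overlaps
    that window; stop when only one interval remains.
    """
    candidates = set(range(1, len(q)))
    lo, width = 0.0, 1.0
    while len(candidates) > 1:
        width /= 2
        if flip():                       # bit 1: keep the upper half
            lo += width
        hi = lo + width
        candidates = {k for k in candidates
                      if q[k] > lo and q[k - 1] < hi}
    return candidates.pop()              # index of the surviving interval
```

For instance, `draw_straw_by_elimination([0.0, 0.5, 0.75, 1.0])` returns $1$, $2$, or $3$ with probabilities $1/2$, $1/4$, $1/4$.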