You have a fair coin. Can you use this coin to generate a random event with a \( \frac{1}{5} \) chance of success?
Of course you can! For example, if you toss the coin 3 times, you get eight equally likely outcomes HHH, HHT, HTH, HTT, THH, THT, TTH, TTT. You can designate the first (HHH) as a success, the next four (HHT, HTH, HTT, THH) as failures, and the remaining three indicate that you redo it all over again.
We can prove that the expected number of coin tosses used in this method is
\[ 3 \times \frac{1}{1 - \frac{3}{8}} = \frac{24}{5}. \]
Find the strategy that minimizes the expected number of coin tosses that are needed. The expected value can be expressed as \( \frac{a}{b} \), where \( a, b \) are relatively prime positive integers. Enter your answer as \( a + b \).
Good clear explanation of this setup.
What happens in the case of dyadic fractions?
Re Challenge Master: In case of dyadic fractions:
For the special cases \( p = 0, 1 \), the answer is 0.
Otherwise, \( p = \frac{a}{2^b} \) where \( a \) is an odd positive integer and \( b \) is a positive integer. The answer is \( 2\left(1 - \frac{1}{2^b}\right) \). The algorithm that does this is almost identical (except that in step 3, we replace \( x < 1 \) with \( x \leq 1 \)). After \( b \) iterations, we get \( x = 1 \) and thus we will exit. The proof that this is the minimum is also similar; if we stopped after fewer than \( b \) tosses, we wouldn't have enough branches, so some branch must reach \( b \) tosses.
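For a concrete (made-up) instance: take \( p = \frac{3}{4} = 0.11_2 \), so \( b = 2 \). The first toss reads the bit 1, so heads is an immediate success; on tails, the second iteration reaches \( x = 1 \), where the modified rule stops either way (heads: success, tails: failure). The expected number of tosses is \( \frac{1}{2} \cdot 1 + \frac{1}{2} \cdot 2 = \frac{3}{2} = 2\left(1 - \frac{1}{2^2}\right) \), matching the formula above, and the overall success probability is \( \frac{1}{2} + \frac{1}{4} = \frac{3}{4} \), as required.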
Great! It's slightly surprising (at least initially) that it takes < 2 tries on average to determine an outcome with probability of \( \frac{1}{8} \). "So most of the time you only flip one coin?"
Can you shorten it please?
I rephrased the above solution. The solution itself is under the Solution heading. The motivation for that strategy is provided under Motivation.
Overview
The expected number of coin tosses required is 2, giving a + b = 2 + 1 = 3 .
More surprisingly, we can replace 1/5 with any probability \( p \), rational or otherwise, that is not a dyadic fraction (a fraction whose denominator is a power of 2), and the answer is still 2. (For dyadic fractions, the answer is less than 2.)
In the following, let p be the probability of success we want that is not a dyadic fraction.
Solution
The strategy is surprisingly simple. Keep track of a number \( x \), and do the following algorithm:
1. Set \( x = p \).
2. Toss the coin, and double \( x \).
3. If \( x < 1 \): if the toss came up tails, return failure; otherwise go back to step 2.
4. If \( x \geq 1 \): subtract 1 from \( x \); if the toss came up heads, return success; otherwise go back to step 2.
Note that every iteration tosses exactly one coin and exits half of the time. (Which of steps 3 and 4 can end the iteration depends on \( x \), but in either case the algorithm exits when the coin comes up one particular way and continues when it comes up the other way, so the chance of exiting is \( \frac{1}{2} \) regardless.) If \( E \) is the expected number of coin tosses in this strategy, then \( E = 1 + \frac{1}{2} \cdot E \): we always spend one coin toss on the current iteration, and half of the time we still expect \( E \) further tosses afterwards. Thus \( E = 2 \); the expected number of coin tosses is 2.
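As a sanity check, here is a short simulation of this algorithm (a Python sketch of my own; the function and variable names are not from the problem). With \( p = \frac{1}{5} \) it should report a success rate near 0.2 and an average of about 2 tosses.

```python
import random

def coin_event(p):
    """Generate a success/failure event with success probability p
    using only fair coin tosses. Returns (success, number_of_tosses)."""
    x = p
    tosses = 0
    while True:
        toss = random.choice(["H", "T"])  # one fair coin toss per iteration
        tosses += 1
        x *= 2                            # move to the next binary digit of p
        if x < 1:
            if toss == "T":               # bit of p is 0 but we tossed 1: failure
                return False, tosses
        else:
            x -= 1                        # keep only the fractional part
            if toss == "H":               # bit of p is 1 but we tossed 0: success
                return True, tosses
        # otherwise the toss matches the bit of p so far; keep going

if __name__ == "__main__":
    trials = 100_000
    # 1 / 5 as a float is only an approximation of 1/5, which is fine for a quick simulation
    results = [coin_event(1 / 5) for _ in range(trials)]
    successes = sum(1 for ok, _ in results if ok)
    total_tosses = sum(n for _, n in results)
    print("success rate:", successes / trials)       # should be close to 0.2
    print("average tosses:", total_tosses / trials)  # should be close to 2
```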
To prove that we can't do better than 2 coin tosses, notice that since \( p \) is not a dyadic fraction, we cannot put an upper bound on the number of coin tosses required. (If we could guarantee a result within \( n \) coin tosses, then consider the \( 2^n \) possible toss sequences, all equally likely. Each of them is assigned to success or failure depending on its prefix. Thus the probability of getting a success is an integer divided by \( 2^n \), which means the probability is a dyadic fraction, contradiction.) Thus after any number of tosses, at least one of the outcomes must require additional tosses.
After \( n \) tosses, at least one of the \( 2^n \) outcomes must require additional tosses, which means the probability that we obtain a result within \( n \) tosses is at most \( \frac{2^n - 1}{2^n} \). After 1 toss, there is at most a \( \frac{1}{2} \) chance that we already have a result, so at least half of the time we need a second toss. After 2 tosses the chance is at most \( \frac{3}{4} \), so at least \( \frac{1}{4} \) of the time we need a third toss; after 3 tosses it is at most \( \frac{7}{8} \), so at least \( \frac{1}{8} \) of the time we need a fourth toss, and so on. The best we can possibly do, then, is to stop after exactly \( n \) tosses with probability \( \frac{1}{2^n} \) for each \( n \), so the expected number of tosses is at least
\[ \sum_{n=1}^{\infty} n \cdot \frac{1}{2^n} = 2. \]
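If the value of this sum is not familiar, one way to evaluate it is to split each term \( n \cdot \frac{1}{2^n} \) into \( n \) copies of \( \frac{1}{2^n} \) and regroup them into geometric tails:
\[ \sum_{n=1}^{\infty} \frac{n}{2^n} = \sum_{k=1}^{\infty} \sum_{n=k}^{\infty} \frac{1}{2^n} = \sum_{k=1}^{\infty} \frac{1}{2^{k-1}} = 2. \]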
Since we reach the lower bound, we know our strategy is optimal. Thus the expected number of coin tosses is \( 2 = \frac{2}{1} \), giving \( a = 2 \), \( b = 1 \), \( a + b = 3 \).
Motivation for the strategy
First, we will show how to improve on the strategy given in the problem, as inspiration for an optimal strategy.
Let's consider the expected number of tosses in the given strategy. If this number is \( E \), then the expectation satisfies \( E = 3 + \frac{3}{8} \cdot E \): we need 3 coin tosses, and 3/8 of the time we need to repeat it. This is how we obtain \( E = \frac{24}{5} \).
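Written out, the algebra is \( E = 3 + \frac{3}{8}E \implies \frac{5}{8}E = 3 \implies E = \frac{24}{5} \).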
It's easy to show that from those 8 possibilities, if we pick 1 as a success, 4 as failures, and the remaining 3 to mean re-tossing, we have a \( \frac{1}{5} \) chance of success. This doesn't depend on which combinations mean success/failure/re-toss, just how many. Thus, we are free to tweak the assignment. Let's assign all four of THH, THT, TTH, TTT as failures instead. With this assignment, a first toss of tails immediately indicates a failure, so we can actually skip the remaining two tosses.
We can try computing the same thing. This time, \( E = \frac{1}{2} \cdot 1 + \frac{1}{2} \cdot 3 + \frac{3}{8} \cdot E \): half of the time we only need 1 toss (the tails), the other half of the time we need 3 tosses. The 3/8 chance of re-tossing remains the same. This gives \( E = \frac{16}{5} \), a significant improvement.
We also notice that an initial HT forces a re-toss; we can stop after those two tosses and start over, too. The equation becomes
\[ E = \frac{1}{2} \cdot 1 + \frac{1}{4} \cdot (2 + E) + \frac{1}{8} \cdot 3 + \frac{1}{8} \cdot (3 + E). \]
The first term is for tails on the first toss, the second term is for the two tosses HT, the third term is for HHH, and the fourth term is for HHT. This gives \( E = \frac{14}{5} \). Getting pretty close, but still some way from our claim of 2. How can we improve on this?
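As a quick check that this rearrangement still gives the right success probability: a round now ends in success only via HHH (probability \( \frac{1}{8} \)) and forces a re-toss with probability \( \frac{1}{4} + \frac{1}{8} = \frac{3}{8} \), so the overall chance of success is \( \frac{1/8}{1 - 3/8} = \frac{1}{5} \), as required.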
One big problem lies in the contribution of \( \frac{3}{8}E \) on the right-hand side. Can we reduce it? Yes, we can, if we go further, to four tosses. We have 16 possibilities; we can set 3 of them to be successes, 12 to be failures, and that leaves a single outcome for a re-toss. That's a probability of only \( \frac{1}{16} \) of re-tossing.
To be more precise:
- If the first toss is tails (T), stop after 1 toss: failure.
- If the first two tosses are HT, stop after 2 tosses: failure.
- If the first three tosses are HHH, stop after 3 tosses: success.
- If the four tosses are HHTH, stop after 4 tosses: success.
- If the four tosses are HHTT, re-toss.
This gives
\[ E = \frac{1}{2} \cdot 1 + \frac{1}{4} \cdot 2 + \frac{1}{8} \cdot 3 + \frac{1}{16} \cdot 4 + \frac{1}{16} \cdot (4 + E). \]
This simplifies to our desired E = 2 .
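Explicitly, the constants on the right-hand side sum to \( \frac{1}{2} + \frac{1}{2} + \frac{3}{8} + \frac{1}{4} + \frac{1}{4} = \frac{15}{8} \), so \( E = \frac{15}{8} + \frac{E}{16} \), i.e. \( \frac{15}{16}E = \frac{15}{8} \), giving \( E = 2 \).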
Now for the general strategy. Represent \( p \) in binary. Because \( p \) is not a dyadic fraction, this representation is infinite, so we don't need to worry about the double representation \( 0.01111\ldots_2 = 0.10000\ldots_2 \) and the like. Now, our coin tosses will build up a number \( q \) in binary as well, bit by bit, with heads counting as 0 and tails as 1. Whenever we can decide with certainty that \( q < p \) (success) or \( q \geq p \) (failure), we immediately stop tossing and return the result.
The strategy above is actually an adaptation of this. Since \( p = \frac{1}{5} = 0.001100110011\ldots_2 \), we have:
- On tosses 1 and 2 (and, in general, tosses \( 4k+1 \) and \( 4k+2 \)), the corresponding bit of \( p \) is 0, so tails means \( q > p \): return failure. Heads means we cannot decide yet, so continue.
- On tosses 3 and 4 (and, in general, tosses \( 4k+3 \) and \( 4k+4 \)), the corresponding bit of \( p \) is 1, so heads means \( q < p \): return success. Tails means we cannot decide yet, so continue.
Now, this strategy actually already works. But we can rephrase it a bit to obtain the form presented above: keep a running value \( x \) that starts at \( p \); at each toss, double \( x \) and keep only its fractional part, which corresponds to moving one place further to the right of the binary point. Then "if the corresponding bit of \( p \) is 0 but we toss tails (1), return failure" can be rephrased as "if \( x \), after doubling, is less than 1, but we toss tails (1), return failure", and similarly "if the corresponding bit of \( p \) is 1 but we toss heads (0), return success" can be rephrased as "if \( x \), after doubling, is not less than 1, but we toss heads (0), return success". This gives exactly the algorithm described above.
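To see this doubling trick concretely, here is a tiny sketch (Python; the function name `binary_digits` is my own, and the `fractions` module is used to avoid floating-point issues) that reads off the binary digits of \( p = \frac{1}{5} \) by repeated doubling:

```python
from fractions import Fraction

def binary_digits(p, count):
    """Return the first `count` binary digits of p (with 0 < p < 1),
    obtained by repeatedly doubling and keeping the fractional part."""
    x = Fraction(p)
    digits = []
    for _ in range(count):
        x *= 2
        if x >= 1:        # the next binary digit of p is 1
            digits.append(1)
            x -= 1        # keep only the fractional part
        else:             # the next binary digit of p is 0
            digits.append(0)
    return digits

print(binary_digits(Fraction(1, 5), 12))  # [0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1]
```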