Imagine I offer you two games; let's call them Game A and Game B.
In Game A, you bet that a coin I'm going to toss turns up heads; the payout is even (you gain as much when you win as you lose when you lose). But I don't play fair: the coin is biased so that heads comes up less than half the time. Clearly this is a losing game, no? You won't want to play it.
In Game B, you again bet that a coin I'm going to toss turns up heads, and again the payout is even. Unlike the above, I now have two coins! One is that unfair coin from above, but the other is unfair to your favor, turning up heads with probability 0.7. So how do I select which coin to use? I look at the previous game you played: if you won it, I'll use the bad coin, but if you lost it, I'll be generous and use the good coin (the one with heads probability 0.7). For ease of discussion, suppose that in the first game you get the good coin; it turns out this doesn't matter.
Let's analyze Game B. Suppose that in the long run the probability of winning Game B is \(w\) and the probability of losing it is \(l\), and write \(p_b\) and \(p_g\) for the heads probabilities of the bad and good coins (so \(p_g = 0.7\)). Then we get the following system:
\[ w = w\,p_b + l\,p_g \]
\[ l = w\,(1 - p_b) + l\,(1 - p_g) \]
\[ w + l = 1 \]
The first equation is simply "P(win this game) = P(won last game) × P(bad coin gives heads) + P(lost last game) × P(good coin gives heads)", and the second equation is the same with losing chances instead. The third equation follows from total probability: you either win or lose this game, so the two probabilities sum to 1. Solving the system gives the long-run winning probability \(w\), and with these coins it comes out below \(\tfrac{1}{2}\): on average you lose more rounds of Game B than you win. Since the payout for a win is equal to the cost of a loss, in the long run Game B is a losing game.
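Here is the algebra in symbols, as a quick sketch using the \(p_b\), \(p_g\) notation above (substituting the actual coin probabilities recovers the exact numbers). The first and third equations give
\[ w = w\,p_b + (1 - w)\,p_g \quad\Longrightarrow\quad w = \frac{p_g}{1 + p_g - p_b}, \]
and rearranging shows
\[ w < \tfrac{1}{2} \iff p_b + p_g < 1, \]
so Game B on its own is losing exactly when the two coins' heads probabilities add up to less than 1, which is what the choice of coins above ensures.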
But what happens when you play Game A and Game B alternately, in the sequence ABABAB…?
Observe that each Game B result depends only on the Game A played immediately before it, so we might as well define a new game, Game C, which consists of one Game A followed by one Game B; our sequence then becomes CCC…. Let's analyze Game C.
There are essentially four outcomes in Game C:
- You win Game A and then win Game B. Since you won Game A, Game B uses the bad coin. You gain two bets.
- You win Game A and then lose Game B (bad coin again). The gain and the loss cancel out.
- You lose Game A and then win Game B. Since you lost Game A, Game B uses the good coin. Again the net is zero.
- You lose Game A and then lose Game B (good coin, but you lose anyway). You lose two bets.
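Only the first and last of these outcomes change your total. Writing the even stake as 1 unit per game (just a normalization), reusing the \(p_b\), \(p_g\) notation from before, and remembering that Game A is played with the bad coin, this gives
\[ \text{expected gain from one Game C} \;=\; 2\,P(\text{win both}) - 2\,P(\text{lose both}) \;=\; 2\,p_b^2 - 2\,(1-p_b)(1-p_g). \]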
Plugging the coin probabilities into this expression, we obtain a positive number. We have a positive expected value! And since every instance of Game C is independent of the others, playing Game C repeatedly simply magnifies the expected gain. Thus the sequence CCC… is winning! But this is the sequence ABABAB…, made up of two losing games! How can we obtain a winning game from two losing games?
This is known as Parrondo's paradox: by playing two losing games in a suitable sequence, you might be able to turn them into a winning game!
Note that the two games must be dependent in some way. In the example above, Game B depends on the result of the previous game, and that is the trick: we make sure that each Game B depends on a Game A instead of on another Game B. We computed that Game B is losing, but only when the previous game is another Game B; if the previous game is a Game A (or some other game, like a "you always lose" game), then Game B is winning. You can prove that if the games are independent, then playing them in any sequence will still lose in the long run.
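If you want to experiment with this setup numerically, here is a minimal simulation sketch. The good coin's heads probability is the 0.7 from the note; the bad coin's value below is only a placeholder assumption for illustration, and the function and parameter names are just for this sketch. Substitute the intended bad-coin probability to check the claims about Game A, Game B, and the alternating sequence.

```python
import random

P_GOOD = 0.7  # good coin's heads probability, as stated in the note
P_BAD = 0.2   # placeholder only -- substitute the note's actual bad-coin probability

def average_gain(pattern, rounds, seed=0):
    """Average gain per game (even payout, stake 1) when `pattern` is repeated `rounds` times.

    Game 'A' always uses the bad coin.  Game 'B' uses the bad coin if the
    previous game was won and the good coin if it was lost.
    """
    rng = random.Random(seed)
    total = 0
    won_last = False  # so the first Game B uses the good coin, as in the note
    for _ in range(rounds):
        for game in pattern:
            p_heads = P_BAD if game == "A" else (P_BAD if won_last else P_GOOD)
            won_last = rng.random() < p_heads
            total += 1 if won_last else -1
    return total / (rounds * len(pattern))

for pattern in ("A", "B", "AB"):
    print(pattern, average_gain(pattern, 200_000))
```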
Comments
I believe that "but another one is unfair to your favor, turning up heads with probability 0.7" should be "fair" instead?
Here's a simpler way to explain the paradox:
There is a light in the room, which randomly turns on or off every minute. Game A: Check if the light is on. If it is on, you gain $10. If it is off, you lose $20.
Game B: Check if the light is off. If it is off, you gain $30. If it is on, you lose $40.
Each minute, you can choose whether you want to play Game A or B, and you can decide to play Game B (or A) even after you play Game A (or B) and the result is revealed.
What is the maximum expected value of playing this game smartly?
Of course, with this explanation it becomes much clearer how the dependence can turn a negative expected value into a positive one, which makes it easier to think about.
A fair coin turns up heads exactly 0.5 of the time. The one that turns up heads 0.7 of the time is unfair (biased), but to the player's favor.
The thing with my example is that you decide the sequence beforehand, while your example decides the sequence on the go. In your game, the bulk of the game relies on smartly choosing the sequence, so the positive expected value can be attributed to correct selections; in my game, you can't choose the sequence at all and must accept whatever the coin tosses give you, so the positive expected value must come from the games themselves, not from clever choices.
Ah yes, you are right; I wasn't reading it correctly when I made the first statement.
My point is that when trying to explain a paradox, it is best to strip away the "fancy details" and boil it down to the essentials. In this case, the convoluted setup makes it harder for someone to follow along, and they might end up thinking "hm, this game is intentionally tricky and that's the reason why I was wrong", as opposed to "oh, now I see the reason for the paradox".
I understand what you are saying about the difference between the games. The distinction boils down to how you encode "So how do I select which coin to use?", as explained by "the two games must be dependent in some way".
Game A:
Check if light is on. If light is on, you get $10. If light is off, you lose $20.
Game B:
If you won Game A: Check if light is on. If light is on, you win $0; if light is off, you lose $10.
If you lost Game A: Check if light is off. If light is off, you win $30; if light is on, you lose $40.
Now it is pretty clear that, whether the light is on or off, playing the sequence AB always gains you $10.
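Here's a tiny sketch that just enumerates the two light states to confirm this, assuming (as the claim does) that the light does not change between the A half and the B half of the round:

```python
# Check one A-then-B round of the light game for both light states.
for light_on in (True, False):
    gain_a = 10 if light_on else -20          # Game A: +$10 if on, -$20 if off
    won_a = light_on
    if won_a:                                 # won Game A: $0 if on, -$10 if off
        gain_b = 0 if light_on else -10
    else:                                     # lost Game A: +$30 if off, -$40 if on
        gain_b = 30 if not light_on else -40
    print("on" if light_on else "off", gain_a + gain_b)  # prints 10 in both cases
```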
Here's a follow-up question: how can one use this idea to make money in the real world?
Interestingly, there are lots of examples in trading. The idea is that Game A gives you significant information, which is why you are willing to pay (lose money) for it, and then make it back and more in Game B.
Wikipedia gave the following example:
Game A: You lose $1 every time you play.
Game B: If your money is even, win $3. Otherwise lose $5.
That is a simple example too: if your initial money is odd, playing AB gains you $2 each time. I tried to find an example that doesn't depend on how much money you have; it turns out to be hard, or it eludes me for the moment.
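Here's a quick sketch checking that quoted example (both games are deterministic, so no randomness is needed); the starting amount of $101 and the function names are just illustrative. Each game is losing on its own, while alternating AB from an odd starting amount gains $2 per round.

```python
def game_a(money):
    """Game A: you lose $1 every time you play."""
    return money - 1

def game_b(money):
    """Game B: win $3 if your money is even, otherwise lose $5."""
    return money + 3 if money % 2 == 0 else money - 5

def run(start, schedule, repeats):
    """Play the games in `schedule` over and over and return the final amount."""
    money = start
    for _ in range(repeats):
        for game in schedule:
            money = game(money)
    return money

start = 101  # any odd starting amount
print("A only:", run(start, [game_a], 100))            # down $1 per play
print("B only:", run(start, [game_b], 100))            # down $1 per play on average
print("A then B:", run(start, [game_a, game_b], 100))  # up $2 per AB round
```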
This is very interesting.
Why don't you post a wiki article on it? Wikis are more permanent than notes.
...because I'm not used to wikis yet. Let's see...
Oh what subject would this be on? Is there a Paradoxes section?
Even if there's none, this fits in some probability section, about playing games of chance and such.
I would love to create a paradoxes section. Simply get started with "Post Something - Wiki", and we will add them to the relevant sections.