You walk into Martin Gale's Betting Room with an initial budget of $k.
As usual, you can play the following game any number of times (as long as you have what it costs): each round, you win $1 or lose $1, each with probability 1/2.
You decide that you will play until you have increased your money to $n, and then you will stop. Here, 0 < k < n. Of course, you will also have to stop if you lose all your money (i.e. you are ruined).
What is the probability that you are ruined?
Imagine a large number of copies of the given setup proceeding simultaneously. After each player plays their first round, the average money across all players stays the same, because each has an equal chance of winning or losing $1. Likewise, the average remains unchanged after every subsequent round: players still in the game don't change it (the 50-50 outcome has zero expected change), and players who have already finished don't either, since they are no longer playing. Since every player eventually finishes, equating the initial average k to the final expected amount, with p the probability of ruin:
\[ p \cdot 0 + (1 - p) \cdot n = k \implies p = \frac{n-k}{n} \]
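The averaging argument can be checked empirically. Below is a minimal Monte Carlo sketch (the function name `ruin_probability` and the sample values k = 3, n = 10 are illustrative, not from the problem): it simulates many independent players of the fair ±$1 game and compares the observed ruin frequency with (n − k)/n.

```python
import random

def ruin_probability(k, n, trials=100_000, seed=0):
    """Estimate the ruin probability for a fair +-$1 game,
    starting with $k and stopping at $n (goal) or $0 (ruin)."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        money = k
        # Keep playing until the player reaches the goal or is ruined.
        while 0 < money < n:
            money += 1 if rng.random() < 0.5 else -1
        if money == 0:
            ruined += 1
    return ruined / trials

# The martingale argument predicts p = (n - k)/n; for k = 3, n = 10
# that is 0.7, and the estimate should land close to it.
print(ruin_probability(3, 10))
```

Since each round has zero expected change, the fortune is a martingale, which is exactly why the simulated average stays pinned at k throughout.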