Continued from this note.
So far, we have examined the gambler's ruin problem on a small example. What can we say about the more general case of starting out at some budget \(b\) and deciding to cash out at some threshold \(n\)?
72% of people got this right.
You walk into Martin Gale's Betting Room with an initial budget of \(b\) dollars.
As usual, you can play the following game any number of times (if you have what it costs): you stake \(\$1\) on a fair coin flip, so each game increases your budget by \(\$1\) with probability 1/2 and decreases it by \(\$1\) with probability 1/2.
You decide that you will play until you have increased your money to \(n\) dollars, and then you will stop. Here, \(0 < b < n\). Of course you will also have to stop if you lose all your money (i.e. you are ruined).
What is the probability that you are ruined?
SPOILERS AHEAD. Please solve the problem before reading on.
There are a number of ways to tackle this, but the approach I'm going to take is to set up a system of linear equations. Let \(R_b\) denote the probability we want to compute, namely the probability that, starting out at a budget of \(b\), you will lose all your money before you ever hit \(n\).
Since it is pessimistic to talk about ruin, let's set up the system in terms of variables representing the probabilities of winning, i.e. the probability of cashing out at \(n\). In what follows, we will keep \(n\) fixed, and so we will subscript the variables with only the budget.
For \(0 \le k \le n\), let \(W_k\) be the probability that you cash out, starting from an initial budget of \(k\). We want to compute \(W_b\). If you play the game once, then with probability 1/2 you increase your budget by 1 and with probability 1/2 you decrease it by one. And then you get to play again, as if you were starting with your new budget. Thus

\[ W_k = \frac{1}{2}W_{k-1} + \frac{1}{2}W_{k+1} \quad \text{for } 0 < k < n, \]

together with the boundary conditions \(W_0 = 0\) and \(W_n = 1\).
This gives a system of \(n+1\) linear equations in \(W_0, W_1, \ldots, W_n\). Combining the first two of these (\(W_0 = 0\) and \(W_1 = \frac{1}{2}W_0 + \frac{1}{2}W_2\)) we easily see that \(W_2 = 2W_1\). We will use this as a base case of an induction. Now suppose it is true that \(W_j = jW_1\) for all \(j \le k\). Then we'll show it is also true for \(k+1\) as follows. We know that

\[ W_k = \frac{1}{2}W_{k-1} + \frac{1}{2}W_{k+1}. \]

Using the induction hypothesis to substitute for \(W_k\) and \(W_{k-1}\) we have

\[ kW_1 = \frac{1}{2}(k-1)W_1 + \frac{1}{2}W_{k+1}, \]

and so \(W_{k+1} = (k+1)W_1\). But now, using the fact that \(W_n = 1\), we have \(nW_1 = 1\), so \(W_1 = \frac{1}{n}\), and finally \(W_b = \frac{b}{n}\). It follows that the probability of ruin is \(1 - \frac{b}{n}\).
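If you'd like to sanity-check this, here is a short Python sketch (my own illustration, not part of the original note) that solves the same linear system exactly by a "shooting" trick that mirrors the induction: pretend \(W_1 = 1\), propagate the recurrence rewritten as \(W_{k+1} = 2W_k - W_{k-1}\), and then rescale everything so that \(W_n = 1\).

```python
from fractions import Fraction

def win_probabilities(n):
    """Solve W_0 = 0, W_n = 1, W_k = (W_{k-1} + W_{k+1}) / 2 exactly.

    Shooting method: provisionally set W_1 = 1, propagate
    W_{k+1} = 2*W_k - W_{k-1}, then rescale so that W_n = 1.
    """
    W = [Fraction(0), Fraction(1)]  # W_0 = 0; pretend W_1 = 1 for now
    for k in range(1, n):
        W.append(2 * W[k] - W[k - 1])
    scale = W[n]  # W_n must equal 1, so divide everything by this
    return [w / scale for w in W]

# Cash out at n = 10, starting from b = 3: win with probability 3/10,
# so the probability of ruin is 7/10.
probs = win_probabilities(10)
assert probs[3] == Fraction(3, 10)
assert 1 - probs[3] == Fraction(7, 10)
```

Using `Fraction` keeps the arithmetic exact, so the assertions check the closed form \(W_b = b/n\) with no floating-point slop.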
Now what about the question of how long you play for?
How many games do you expect to play before you stop?
Again, setting up a system of linear equations solves the problem. This time the equations we have are \(T_0 = 0\) and \(T_n = 0\), and

\[ T_k = 1 + \frac{1}{2}T_{k-1} + \frac{1}{2}T_{k+1} \quad \text{for } 0 < k < n, \]

where \(T_k\) is the expected number of games starting at a budget of \(k\) (the added 1 counts the game you are about to play). It is not too hard to solve this and see that we get \(T_b = b(n-b)\).
To be continued...