Suppose you start off with $1, and you play a (binary) game in which you win with probability \( \frac{1}{2} \). If you win, you gain $1; if you lose, you lose $1. Say you win this game, so you now have $2. Split the $2 into two lots of $1 and play the game again, but now play it twice (since you have $2). This process is repeated ad infinitum; in other words, whenever you have \( \$n \), you split it into \( n \) lots of $1 and play \( n \) instances of the game. Note that at the end of each step you lump your winnings together and start the process over; the process ends when you are left with $0.
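A minimal simulation sketch of the process as I read it (the function names and parameters below are purely illustrative, not part of the original problem statement):

```python
import random

def play_round(dollars, p_win=0.5):
    """Play one round: each dollar is staked on an independent game.
    A win turns $1 into $2; a loss turns it into $0."""
    return sum(2 if random.random() < p_win else 0 for _ in range(dollars))

def play_until_ruin(start=1, p_win=0.5, max_rounds=50):
    """Run the process until we hit $0, or give up after max_rounds."""
    money = start
    history = [money]
    for _ in range(max_rounds):
        if money == 0:
            break
        money = play_round(money, p_win)
        history.append(money)
    return history

print(play_until_ruin())  # e.g. [1, 2, 2, 4, 2, 0]
```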
Here are some questions that may be of some interest for you:
How do we calculate the probability of having \( \$k \) at a certain point in (discrete) time?
What is the long-term probability of running out of money or ending up with an infinite amount of money?
Does this long-term probability change when you vary the probability of winning the game? What about when you vary the starting amount?
Answering these questions may provide a sturdy bridge into the study of branching processes. The problem has eluded me for years, and I have since moved on to greener pastures; although I would like to return to it, I post it here to garner some interest in the problem.
Comments
Some findings: The expected amount of money at any stage is $1 (not surprising).
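One quick way to see this (the notation \( X_n \) for the amount held after \( n \) rounds is mine): each dollar in play returns $2 or $0 with equal probability, so
\[
\mathbb{E}[X_{n+1} \mid X_n] = X_n \left( \tfrac{1}{2} \cdot 2 + \tfrac{1}{2} \cdot 0 \right) = X_n
\quad\Longrightarrow\quad
\mathbb{E}[X_n] = \mathbb{E}[X_0] = 1.
\]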
The probability of ending up at $0 in \( n \) steps or fewer is given by https://oeis.org/A187131 divided by \( 2^{2^{n-1}} \).
The numerators of the cumulative probabilities of ending up at $0 are https://oeis.org/A167424; see the examples there for the probabilities themselves. Not surprisingly, these fractions tend to 1: you will eventually run out of money.
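A sketch of how these cumulative probabilities can be computed exactly (the notation and code are mine). Writing \( q_n \) for the probability of reaching $0 within \( n \) rounds starting from $1, conditioning on the first game gives \( q_n = \tfrac{1}{2} + \tfrac{1}{2} q_{n-1}^2 \): either you lose immediately, or you win and each of the two resulting dollars starts an independent copy of the process that must itself die out within \( n-1 \) rounds.

```python
from fractions import Fraction

def ruin_within(n):
    """P(reach $0 within n rounds, starting from $1),
    via q_n = 1/2 + 1/2 * q_{n-1}^2 with q_0 = 0."""
    q = Fraction(0)
    for _ in range(n):
        q = Fraction(1, 2) + Fraction(1, 2) * q * q
    return q

for n in range(1, 5):
    print(n, ruin_within(n))
# 1 1/2
# 2 5/8
# 3 89/128
# 4 24305/32768
```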
There is zero probability of infinite money, but there is a positive probability of ever exceeding any finite amount.
After 4 rounds, the probability distribution has all of its probabilities expressible over \( 2^{15} \); a sketch for computing it is below.
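A small sketch of how that distribution can be computed exactly (the dict representation and the helper name next_round are mine): with \( \$k \) in hand, the amount after the next round is \( 2 \times \mathrm{Binomial}(k, \tfrac{1}{2}) \).

```python
from fractions import Fraction
from math import comb

def next_round(dist):
    """Given {amount: probability}, return the distribution one round later.
    With $k in hand, the next amount is 2 * Binomial(k, 1/2)."""
    new = {}
    for k, p in dist.items():
        if k == 0:
            new[0] = new.get(0, Fraction(0)) + p  # $0 is absorbing
            continue
        for wins in range(k + 1):
            q = p * comb(k, wins) * Fraction(1, 2) ** k
            new[2 * wins] = new.get(2 * wins, Fraction(0)) + q
    return new

dist = {1: Fraction(1)}  # start with $1
for _ in range(4):
    dist = next_round(dist)

for amount in sorted(dist):
    print(f"${amount}: {dist[amount] * 2**15}/32768")
```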
Define a branching process where each node has 0 children w.p. \( \frac{1}{2} \) (which corresponds to losing that particular game) and 2 children w.p. \( \frac{1}{2} \) (which corresponds to winning that particular game). The long-term probability of running out of money is equal to the probability of extinction of this branching process, which is given by the smallest non-negative root of the fixed-point equation \( z = \frac{1}{2} + \frac{1}{2} z^2 \), which is 1. Hence, we will run out of money eventually almost surely.
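A quick check of that root (my own algebra):
\[
z = \tfrac{1}{2} + \tfrac{1}{2} z^2 \;\Longleftrightarrow\; z^2 - 2z + 1 = 0 \;\Longleftrightarrow\; (z-1)^2 = 0,
\]
so the only root, and hence the smallest non-negative one, is \( z = 1 \).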
Keeping the probability of winning \( p_w \) the same (i.e., \( \frac{1}{2} \)), for any finite amount of starting money, we still run out of money eventually almost surely. For a general \( p_w \), this probability may be obtained by examining the smallest root of the fixed-point equation \( z = p_w z^2 + (1 - p_w) \), which is \( \alpha = \min\!\left(1, \frac{1 - p_w}{p_w}\right) \).
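The roots of that quadratic can be read off from the factorisation (again, my own algebra):
\[
p_w z^2 - z + (1 - p_w) = (z - 1)\bigl(p_w z - (1 - p_w)\bigr) = 0
\quad\Longrightarrow\quad
z = 1 \ \text{or} \ z = \frac{1 - p_w}{p_w},
\]
so \( \alpha = \min\!\left(1, \frac{1 - p_w}{p_w}\right) \); in particular, \( \alpha = 1 \) exactly when \( p_w \le \frac{1}{2} \).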
If we had initially started with \( \$N \), the probability of eventually running out of money becomes \( \alpha^N \), since each of the \( N \) initial dollars starts an independent copy of the branching process and all \( N \) copies must go extinct.
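A crude Monte Carlo sanity check of \( \alpha^N \) (the code, parameter values, and the truncation scheme of treating a very large bankroll as survival are all my own approximations):

```python
import random

def ruined(start_dollars, p_win, max_rounds=200, survive_threshold=1000):
    """Simulate until ruin, until the bankroll is large enough that ruin is
    numerically negligible, or until max_rounds."""
    money = start_dollars
    for _ in range(max_rounds):
        if money == 0:
            return True
        if money >= survive_threshold:
            return False
        money = sum(2 for _ in range(money) if random.random() < p_win)
    return money == 0

p_win, N, trials = 0.75, 2, 5000
alpha = (1 - p_win) / p_win
estimate = sum(ruined(N, p_win) for _ in range(trials)) / trials
print(f"simulated ruin probability: {estimate:.3f}, alpha^N = {alpha**N:.3f}")
```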