Dying Particle Population

Imagine an exotic particle that decays into some number $i$ of identical particles, with $i \geqslant 0$. So, for instance, when $i = 0$, the particle dissipates as energy; when $i = 1$, it is as if nothing happens to the particle; when $i > 1$, the particle is effectively cloned with the help of the unlimited surrounding energy.

Now let $K$ be the number of a particle's offspring in the next step. Specifically, let $K$ be a truncated Poisson variate with mean $\frac{3}{2}$ and possible values $0, 1, 2, 3, 4$, which means

$$\mathbb{P}\{K = k\} = \mathbb{P}\{L = k \mid L \leqslant 4\} = \frac{\mathbb{P}\{L = k\}}{\mathbb{P}\{L \leqslant 4\}},$$

where $k$ is between $0$ and $4$ and $L$ is an ordinary Poisson variate with mean $\frac{3}{2}$.
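As a quick sanity check, the truncated pmf can be computed directly. A minimal Python sketch (the function name is mine, not from the problem):

```python
from math import exp, factorial

def truncated_poisson_pmf(mean=1.5, cutoff=4):
    """pmf of K: P{L = k} / P{L <= cutoff} for k = 0..cutoff, L ~ Poisson(mean)."""
    raw = [exp(-mean) * mean**k / factorial(k) for k in range(cutoff + 1)]
    total = sum(raw)  # = P{L <= cutoff}, the normalizing constant
    return [p / total for p in raw]

pmf = truncated_poisson_pmf()
print([round(p, 4) for p in pmf])  # probabilities for K = 0, 1, 2, 3, 4
```

Note that truncation only rescales the Poisson weights, so ratios such as $\mathbb{P}\{K=1\}/\mathbb{P}\{K=0\} = \frac{3}{2}$ are preserved.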

Let $X_n$ be the number of particles in the $n^\text{th}$ generation, with $X_0 = 2$. What is the probability that this population of particles dies out eventually?


The answer is 0.192618.


1 solution

Borut Levart
Dec 14, 2017

First note that the number of particles at time $n+1$ is a function of the number of particles at time $n$ and some randomness, namely the sum of $X_n$ instances of $K$. This makes the process a Markov chain with states $0$ through $\infty$.

So one can in principle approximate the result by setting up the transition matrix of a finite Markov chain with some cut-off value. Let's say that value is $9$: from nine particles onward it is practically impossible for the population to die out, since there are too many breeders. That makes states $0$ and $9$ the two absorbing states in this $10 \times 10$ system, with transition probabilities:

$$p_{00} = 1, \qquad p_{99} = 1$$
$$p_{ij} = \mathbb{P}\Big\{ \sum_{k=1}^{i} K_k = j \Big\}, \quad 0 < i < 9, \ 0 \leqslant j < 9$$
$$p_{i9} = \mathbb{P}\Big\{ \sum_{k=1}^{i} K_k \geqslant 9 \Big\}, \quad 0 < i < 9$$

The chance of absorption into state $0$ from state $2$ is $0.192422$, which is correct to three decimal places.
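The truncated-chain computation can be sketched in Python as follows. This is my own reconstruction of the method described above, not the author's code: the sum of $i$ copies of $K$ is built by repeated convolution of the pmf, mass at nine or more particles is lumped into the upper absorbing state, and the chain is iterated until essentially all mass sits in the two absorbing states.

```python
import numpy as np
from math import exp, factorial

mean, cutoff, N = 1.5, 4, 9          # N = 9 is the upper absorbing cut-off state
raw = np.array([exp(-mean) * mean**k / factorial(k) for k in range(cutoff + 1)])
pK = raw / raw.sum()                 # pmf of K on 0..4

P = np.zeros((N + 1, N + 1))
P[0, 0] = 1.0                        # state 0: extinct, absorbing
P[N, N] = 1.0                        # state 9: "too many breeders", absorbing
for i in range(1, N):
    dist = np.array([1.0])           # pmf of the sum of zero copies of K
    for _ in range(i):
        dist = np.convolve(dist, pK) # add one particle's offspring count
    dist = np.pad(dist, (0, max(0, N + 1 - len(dist))))
    P[i, :N] = dist[:N]
    P[i, N] = dist[N:].sum()         # lump all mass at >= 9 into state 9

v = np.zeros(N + 1)
v[2] = 1.0                           # start with X_0 = 2 particles
for _ in range(500):                 # push the chain to (near) absorption
    v = v @ P
print(round(v[0], 6))                # should land near the quoted 0.192422
```

The cut-off introduces a small downward bias: the true chain can still go extinct from states $\geqslant 9$, while the truncated one cannot, which is why this estimate sits just below the exact answer.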


One can calculate the answer more easily and precisely. Let us start with a single particle, so $X_0 = 1$, and condition on the initial transition:

$$\alpha = \mathbb{P}\{\text{event } D\text{: population dies out} \mid X_0 = 1\} = \sum_{j=0}^{\infty} \mathbb{P}\{D \mid X_1 = j,\ X_0 = 1\}\, p_{1j}$$

Since each particle acts independently, the family, the dynasty if you will, of the initial particle dies out eventually if and only if the families of all its first-generation offspring die out eventually, so:

$$\alpha = \sum_{j=0}^{4} \alpha^j\, p_{1j} = p_{10} + \alpha\, p_{11} + \alpha^2\, p_{12} + \alpha^3\, p_{13} + \alpha^4\, p_{14}$$

The summation boundary is brought down to accommodate the range of $K$. We are left with a polynomial with coefficients $p_{1k} = \mathbb{P}\{K = k\}$, and we are interested in its smallest positive root. Originally we have $X_0 = 2$, which means that not one but two families must both die out eventually, and that makes the final answer $\alpha^2 = \boxed{0.192618}$.
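The smallest non-negative root can be found with the standard fixed-point iteration for branching processes: starting from $\alpha_0 = 0$ and repeatedly applying $\alpha \mapsto \sum_k p_{1k}\,\alpha^k$ converges monotonically up to the extinction probability. A minimal Python sketch (my own, under that standard technique):

```python
from math import exp, factorial

mean, cutoff = 1.5, 4
raw = [exp(-mean) * mean**k / factorial(k) for k in range(cutoff + 1)]
total = sum(raw)
p = [x / total for x in raw]     # p[k] = P{K = k}

# Fixed-point iteration a <- sum_k p[k] * a^k, started at 0, converges
# monotonically to the smallest non-negative root of the polynomial:
# the single-particle extinction probability alpha.
a = 0.0
for _ in range(1000):
    a = sum(pk * a**k for k, pk in enumerate(p))

print(round(a**2, 6))            # with X_0 = 2, both family lines must die out
```

Since the mean offspring count exceeds $1$ here, the process is supercritical and this root lies strictly between $0$ and $1$.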

I don't know if you designed this problem, but if you did, I don't know whether you did this on purpose: the problem resists solution via least squares. Without recognizing that you can use powers of $\alpha$, and instead labeling each as a different variable, after 42 equations (164 unknowns) the matrix becomes too singular, and you cannot see what the least-squares solutions converge to. I also tried setting all the extra unknowns equal to zero, but that seems to converge to the wrong value, namely $0.17826$. Nice problem and solution! (+1)

James Wilson - 3 years, 4 months ago


(Just so you know, I don't actually understand what you did with the Markov chain. The equation that you used that contains $\alpha$ would've been equation 1 of my system. Then I got the other equations by using the Poisson distribution for the sum of $n$ random variables. However, there is no telling whether the least-norm solution should converge to the correct solution, considering the system is always underdetermined. But it would be nice if you could use more digits to try and find out. By the way, I used Matlab.)

James Wilson - 3 years, 4 months ago

This is an old problem (Wikipedia: Galton–Watson process); I just dressed it up. The Markov chain moves between states $0$ through $9$, representing the size of the current population.

Borut Levart - 3 years, 4 months ago


Nice, thanks for sharing!

James Wilson - 3 years, 4 months ago
