Unexpected Probability Problem!!

You have 2 fair coins with you. You do the following process with them.

  • You flip the first coin and note its outcome. Then you put away the first coin.

  • Now, you take the second coin and flip it again and again until you encounter a tail. Once you encounter it, you put away the second coin.

  • Now, if the outcome of the first coin was a tail, you stop the process. Otherwise, you start again from the first step.

Find the expected value of the number of times you'll flip the second coin.


All of my problems are original.


Difficulty: ††† (3 of 5)


The answer is 4.


2 solutions

Aryan Sanghi
Jun 28, 2020

Let's find the expected number of times you'll flip the second coin once you pick it up.

$E_1 = \sum \text{(number of times it is flipped)} \times \text{(probability that it's flipped that number of times)}$

$E_1 = 1 \times \left(\frac{1}{2}\right)^1 + 2 \times \left(\frac{1}{2}\right)^2 + 3 \times \left(\frac{1}{2}\right)^3 + \cdots$

$\boxed{E_1 = 2}$
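The series above is arithmetico-geometric; one standard way to evaluate it is to subtract half of it from a shifted copy of itself, which leaves a plain geometric series:

```latex
\begin{aligned}
E_1 &= 1\cdot\tfrac12 + 2\cdot\tfrac14 + 3\cdot\tfrac18 + \cdots \\
\tfrac12 E_1 &= \hphantom{1\cdot\tfrac12 + {}} 1\cdot\tfrac14 + 2\cdot\tfrac18 + \cdots \\
E_1 - \tfrac12 E_1 &= \tfrac12 + \tfrac14 + \tfrac18 + \cdots = 1
\end{aligned}
```

so $\tfrac12 E_1 = 1$, i.e. $E_1 = 2$. (Equivalently, this is the mean $1/p = 2$ of a $\mathrm{Geo}(\tfrac12)$ random variable.)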


Now let's find the expected total number of times you'll flip the second coin, taking the first coin into account as well.

$E_2 = \sum \text{(number of times the first coin is flipped)} \times \text{(probability that it's flipped that number of times)} \times E_1$

$E_2 = 1 \times \left(\frac{1}{2}\right)^1 \times E_1 + 2 \times \left(\frac{1}{2}\right)^2 \times E_1 + 3 \times \left(\frac{1}{2}\right)^3 \times E_1 + \cdots$

$E_2 = 2 \times E_1$

$\boxed{E_2 = 4}$
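As a sanity check, here is a short Monte Carlo simulation of the process described in the problem (the helper name is mine, not part of the solution); the sample average should land close to the exact answer 4:

```python
import random

def total_second_coin_flips(rng: random.Random) -> int:
    """Run the whole process once; return how often the second coin was flipped."""
    total = 0
    while True:
        first_is_tail = rng.random() < 0.5   # flip the first coin
        while True:                          # flip the second coin until a tail
            total += 1
            if rng.random() < 0.5:
                break
        if first_is_tail:                    # first coin showed a tail: process ends
            return total

rng = random.Random(0)
trials = 200_000
estimate = sum(total_second_coin_flips(rng) for _ in range(trials)) / trials
print(f"average second-coin flips: {estimate:.2f}")  # should be close to 4
```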

Mark Hennings
Jun 29, 2020

Let $N$ be the number of times the first coin is tossed, and let $Y_j$ be the number of times the second coin is tossed after the $j$th toss of the first coin. Thus the total number of tosses of the second coin is the random variable $Z = \sum_{j=1}^N Y_j$. Suppose that the probability of the first coin coming up tails is $p_1$, and that the probability of the second coin coming up tails is $p_2$. Then $N$ has the geometric distribution $\mathrm{Geo}(p_1)$, while each of the random variables $Y_j$ has the geometric distribution $\mathrm{Geo}(p_2)$, and the random variables $N, Y_1, Y_2, \ldots$ are independent of each other.

If $N = n$ then $Z = Y_1 + Y_2 + \cdots + Y_n$, and so $E[Z \mid N=n] = E\left[\sum_{j=1}^n Y_j\right] = \sum_{j=1}^n E[Y_j] = n p_2^{-1}$ for all $n \ge 1$. Thus we deduce that $E[Z \mid N] = p_2^{-1} N$, and hence $E[Z] = E\big[E[Z \mid N]\big] = E[p_2^{-1} N] = p_2^{-1} E[N] = p_1^{-1} p_2^{-1}$. In the case $p_1 = p_2 = \tfrac12$ we obtain $E[Z] = \boxed{4}$.
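The closed form $E[Z] = p_1^{-1} p_2^{-1}$ can also be checked numerically for biased coins; the sketch below (function name is mine, not from the solution) simulates the process for two choices of tail probabilities and compares against the formula:

```python
import random

def simulate_total_flips(p1: float, p2: float, rng: random.Random) -> int:
    """One run of the process: total tosses of the second coin,
    where p1 and p2 are the tail probabilities of the two coins."""
    total = 0
    while True:
        first_is_tail = rng.random() < p1
        while True:                    # toss the second coin until a tail
            total += 1
            if rng.random() < p2:
                break
        if first_is_tail:              # stop once the first coin showed a tail
            return total

rng = random.Random(1)
trials = 100_000
results = {}
for p1, p2 in [(0.5, 0.5), (0.5, 1 / 3)]:
    avg = sum(simulate_total_flips(p1, p2, rng) for _ in range(trials)) / trials
    results[(p1, p2)] = avg
    print(f"p1={p1}, p2={p2:.3f}: simulated {avg:.2f}, exact {1 / (p1 * p2):.2f}")
```

For fair coins the exact value is $4$; for $p_2 = \tfrac13$ the formula predicts $2 \times 3 = 6$.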

Excellent solution, sir. Thank you for sharing it with us.

Aryan Sanghi - 11 months, 2 weeks ago

