Gambler's fallacy...?

A coin has probability \(p\) of coming up heads, where \(p\) is drawn uniformly at random from \([0,1]\). The coin is tossed \(10\) times, and all ten tosses come up heads. What is the probability that the next toss is tails?


The answer is \(\frac{1}{12} \approx 0.0833\).


3 solutions

Tunk-Fey Ariawan
Nov 17, 2014

We solve this problem using Bayes' theorem. The key distributions are:

The prior distribution is clearly the uniform distribution \(\mathcal{U}(0,1)\), with pdf \(\pi(p)=1\).

The model distribution is the probability distribution of the collected data given a particular value of the parameter, with pdf \(f(\mathbf{x}\mid p)\), where \(\mathbf{x}=(x_1,\ldots,x_n)^{T}\) is a vector of observations. Assuming the observations are i.i.d.,
\[ f(\mathbf{x}\mid p)=f(x_1\mid p)\cdots f(x_n\mid p). \]
Here we have 10 observations, all heads, so the model distribution is
\[ f(\mathbf{x}\mid p)=p^{10}. \]
Hence, by definition, the pdf of the joint distribution is
\[ f(\mathbf{x},p)=f(\mathbf{x}\mid p)\,\pi(p)=p^{10}, \]
and the pdf of the marginal distribution is
\[ f(\mathbf{x})=\int f(\mathbf{x},p)\,dp=\int_0^1 p^{10}\,dp=\frac{1}{11}. \]
The posterior distribution therefore has pdf
\[ \pi(p\mid\mathbf{x})=\frac{f(\mathbf{x},p)}{f(\mathbf{x})}=11p^{10}. \]
Thus, the probability that the next toss (the next observation \(y\)) is tails, given that the previous ten observations are all heads, is
\[
\begin{aligned}
\Pr[y=\text{tail}\mid\mathbf{x}=\text{10 heads}]
&=\int_0^1 f(y\mid p)\,\pi(p\mid\mathbf{x})\,dp\\
&=11\int_0^1 (1-p)\,p^{10}\,dp\\
&=\frac{1}{12}\approx 0.08333.
\end{aligned}
\]
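A quick Monte Carlo sanity check of this result (not part of the original solution; the trial count and variable names are my own): sample \(p\) uniformly, simulate 11 tosses, keep only runs whose first 10 tosses are all heads, and estimate how often the 11th toss is tails. The estimate should land near \(1/12\).

```python
import random

# Monte Carlo sketch: draw p ~ Uniform(0, 1), simulate 11 tosses,
# condition on the first 10 being heads, and check the 11th toss.
trials = 2_000_000
runs_with_ten_heads = 0
runs_with_tails_next = 0

for _ in range(trials):
    p = random.random()                                 # p ~ Uniform(0, 1)
    tosses = [random.random() < p for _ in range(11)]   # True = heads
    if all(tosses[:10]):                                 # 10 heads in a row
        runs_with_ten_heads += 1
        if not tosses[10]:                               # 11th toss is tails
            runs_with_tails_next += 1

print(runs_with_tails_next / runs_with_ten_heads)        # ≈ 1/12 ≈ 0.0833
```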

Omkar Kamat
Dec 24, 2014

There is an interesting variant of this problem called Laplace's rule of succession, where the same Bayesian approach is used to determine the probability that the sun will rise tomorrow, given that it has risen for \(n\) days already.
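For reference, here is the general form of that rule under the same uniform prior (a short derivation sketch added for context, not part of the original comment); taking \(n=10\) recovers the answer above.

```latex
% Rule of succession with a uniform prior on p: after n heads in n tosses,
% the posterior predictive probability of heads on the next toss is
\[
\Pr[\text{head on toss } n+1 \mid n \text{ heads}]
  = \frac{\int_0^1 p \cdot p^{n}\,dp}{\int_0^1 p^{n}\,dp}
  = \frac{n+1}{n+2},
\]
% so for n = 10 the probability of tails on the next toss is
\[
1 - \frac{11}{12} = \frac{1}{12}.
\]
```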

Patrick Corn
Sep 8, 2014

Warning: some of what I say here is not rigorous, so I'll let a probability expert clean up the language. I think the idea is correct, though.

Let \(A_p\) be the event of the coin having probability \(p\), and let \(B\) be the event of ten heads in a row. Then Bayes' theorem gives
\[
P(A_p\mid B)=\frac{P(B\mid A_p)\,P(A_p)}{\int_0^1 P(B\mid A_q)\,P(A_q)}
=\frac{p^{10}\,dp}{\int_0^1 q^{10}\,dq}
=11p^{10}\,dp.
\]
Then the probability of tails is
\[
\int_0^1 (1-p)\,P(A_p\mid B)=\int_0^1 11(1-p)\,p^{10}\,dp=\fbox{1/12}.
\]
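As a quick symbolic check of that final integral (a sketch of my own using sympy, not part of the original solution):

```python
from sympy import symbols, integrate

# The posterior density after ten heads is 11*p**10; the posterior
# predictive probability of tails is the integral of (1 - p) against it.
p = symbols('p')
posterior = 11 * p**10
prob_tails = integrate((1 - p) * posterior, (p, 0, 1))
print(prob_tails)  # prints 1/12
```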

The idea is correct, but I think \(dp\) should not be included in \(P(A_p\mid B)\).

Tunk-Fey Ariawan - 6 years, 6 months ago
