A coin has probability $p$ of showing heads, where $p$ is drawn uniformly at random from $[0, 1]$. The coin is tossed 10 times, and all ten tosses come up heads. What is the probability that the next toss is tails?
There is an interesting variant of this problem called Laplace's rule of succession, where the Bayesian approach is used to determine the probability that the sun will rise tomorrow given that it has risen for $n$ days already.
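Laplace's rule of succession gives a closed form for this kind of posterior predictive probability: with $s$ successes observed in $n$ trials and a uniform prior on $p$, the probability of a success on the next trial is $(s+1)/(n+2)$. A minimal sketch in Python (the function name is my own):

```python
from fractions import Fraction

def rule_of_succession(successes, trials):
    """Laplace's rule: posterior predictive probability of success
    on the next trial, given a uniform prior on p."""
    return Fraction(successes + 1, trials + 2)

# Ten heads in ten tosses: probability the next toss is heads, then tails.
p_heads = rule_of_succession(10, 10)
p_tails = 1 - p_heads
print(p_heads, p_tails)  # → 11/12 1/12
```

This recovers the $1/12$ answer for the coin problem directly, without redoing the integral.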
Warning: some of what I say here is not rigorous, so I'll let a probability expert clean up the language. I think the idea is correct, though.
Let $A_p$ be the event of the coin having probability $p$, and $B$ the event of ten heads in a row. Then Bayes' theorem gives
$$P(A_p \mid B) = \frac{P(B \mid A_p)\,P(A_p)}{\int_0^1 P(B \mid A_q)\,P(A_q)} = \frac{p^{10}\,dp}{\int_0^1 q^{10}\,dq} = 11\,p^{10}\,dp.$$
Then the probability of tails is
$$\int_0^1 (1-p)\,P(A_p \mid B) = 11\int_0^1 (1-p)\,p^{10}\,dp = 1/12.$$
The idea is correct, but I think $dp$ should not be included in $P(A_p \mid B)$.
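As a sanity check on the $1/12$ answer, here is a quick Monte Carlo sketch in Python (variable names and the sample size are my own choices): draw $p$ uniformly, keep only runs where all ten tosses are heads, and count how often the next toss is tails.

```python
import random

random.seed(0)

trials = 200_000
conditioned = 0   # runs where all ten tosses were heads
next_tails = 0    # among those, runs where toss 11 was tails

for _ in range(trials):
    p = random.random()                          # p ~ Uniform(0, 1)
    if all(random.random() < p for _ in range(10)):  # ten heads in a row
        conditioned += 1
        if random.random() >= p:                 # the next toss is tails
            next_tails += 1

# The conditional frequency should be close to 1/12 ≈ 0.0833.
print(next_tails / conditioned)
```

With roughly $200{,}000/11 \approx 18{,}000$ conditioned runs, the estimate typically lands within about $0.005$ of $1/12$.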
We solve this problem using Bayes' theorem. The key distributions are:
The prior distribution is clearly the uniform distribution $U(0, 1)$, with pdf $\pi(p) = 1$.
The model distribution is the probability distribution of the data given a particular value of the parameter, with pdf $f(\mathbf{x} \mid p)$, where $\mathbf{x} = (x_1, \ldots, x_n)^T$ is vector notation for the observations. Assuming the observations are i.i.d.,
$$f(\mathbf{x} \mid p) = f(x_1 \mid p) \cdots f(x_n \mid p).$$
Here we have 10 observations, all heads, so the model distribution is
$$f(\mathbf{x} \mid p) = p^{10}.$$
Hence, by definition, the pdf of the joint distribution is
$$f(\mathbf{x}, p) = f(\mathbf{x} \mid p) \cdot \pi(p) = p^{10},$$
and the pdf of the marginal distribution is
$$f(\mathbf{x}) = \int f(\mathbf{x}, p)\,dp = \int_0^1 p^{10}\,dp = \frac{1}{11}.$$
The posterior distribution has pdf
$$\pi(p \mid \mathbf{x}) = \frac{f(\mathbf{x}, p)}{f(\mathbf{x})} = 11\,p^{10}.$$
Thus the probability that the next toss (the next observation, $y$) is tails, given that the previous observations are all heads, is
$$\Pr[y = \text{tail} \mid \mathbf{x} = \text{10 heads}] = \int_0^1 f(y \mid p) \cdot \pi(p \mid \mathbf{x})\,dp = 11 \int_0^1 (1-p)\,p^{10}\,dp = \frac{1}{12} \approx 0.08333.$$
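The final integral is easy to verify exactly, since $\int_0^1 p^k\,dp = 1/(k+1)$. A short sketch with Python's exact rational arithmetic (the variable names are mine):

```python
from fractions import Fraction

# Posterior pdf: pi(p | x) = 11 * p^10 on [0, 1].
# Expand the integrand: 11 * (1 - p) * p^10 = 11 * (p^10 - p^11),
# and use the fact that the integral of p^k over [0, 1] is 1/(k + 1).
p_tails = 11 * (Fraction(1, 11) - Fraction(1, 12))

print(p_tails)         # → 1/12
print(float(p_tails))  # ≈ 0.0833
```

`Fraction` reduces $11/132$ to $1/12$ automatically, matching the solution above.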