French mathematician Joseph Bertrand posed the following problem in 1889:
You have 3 identical boxes, each containing 2 coins: the first box contains 2 gold coins, the second box 2 silver coins, and the third box 1 gold coin and 1 silver coin.
Your friend shuffles the boxes at random. Then, you choose a box and pull a coin out of it, and it's gold.
What is the probability that the other coin in the same box is also a gold coin?
OMG yes! That is an excellent way to see it: it doesn't matter that the coins are spread over two boxes... you might as well imagine the contents being in a single box :p.
I disagree with the 2/3 answer. Without applying all of the probability math that others have posted, the answer seems clear. If I pull a gold coin out of a box, that means I cannot have selected the box with two silver coins. Therefore, my odds of pulling another gold coin out of the same box are 50/50: the 2nd coin in the box is either the other gold coin, or it's the silver coin from the box with one gold coin and one silver coin.
You are correct that you eliminate one box with that first draw. But the information that you drew a gold coin also makes it more likely that you drew from box 1 than box 3.
I must be missing something; the problem clearly states that the possibilities have been narrowed down to ONE box, with one coin in it. It is either gold or silver. Therefore, one chance in two, or ½. Why am I wrong, please?
The gold coin might be from the GG box, and it can be either the 1st coin or the 2nd: 2 possibilities in which the second coin is always gold.
And 1 possibility of the GS box, in which the second coin is silver.
Therefore, 2/3.
I agree with you. Once you have drawn a gold coin it could only have come from either of two boxes.
I posted this example in a couple other comments:
Let's say there are two boxes. One contains 99 gold coins and one silver coin. The other contains 99 silver coins and one gold coin. You pull a gold coin from a random box. If you pull another coin from the same box, is it more likely to be gold or silver?
I too strongly agree with you... the answer is 1/2... I don't see how it could be 2/3...
Manuj Rajpal has explained it perfectly. But if you don't see it yet, it can be helpful to see it the way I do: you have a higher probability of taking a gold coin in the first place from a box with two gold coins than from a box with only one. So, if your first pick is gold, you are more probably picking from the box with a second gold coin than from the box with a second silver coin (and only one gold coin. EDIT: "with a second silver coin"... which you could have taken in the first draw, but that was not the case. So, it's not trivial that the first coin is gold).
Yes, but the initial probability doesn't count; it doesn't matter how we got there, we have a new starting situation with two possible outcomes: the second coin is either gold or silver, and both have the same chance. Or if you prefer, once you have taken a gold coin, your box was either the box with two gold coins or the box with one gold and one silver, again 1 in 2 chances that you chose the box with 2 gold coins.
@Joaquin Llorente – I'm not talking about the first probability, I'm talking about an intuitive way to see why it is higher (2/3) than what (a false) rationality tells us (1/2). This is conditional probability, so the events have to be somehow related; they are not totally independent. You are disregarding the info that you get from your first take: that you took a gold coin. This is not trivial information. Yes, you have two possible outcomes, but they are not equally probable. This is the point.
Elegant solution
Can I ask a question? Is it possible to see this the same way we solve the Monty Hall problem? If you picked a gold coin, isn't that equivalent to eliminating the surely incorrect box?
Yes, I thought it felt similar, until I used the probability "tree". In this problem, there are 6 branches, all equally likely, so an equal chance of selecting a G or S coin.
In the MHP, you end up with an equal number of win/lose branches, but not all branches are equally likely.
But there may be another way to map this problem to the MHP, if so, I hope someone posts it
Yes, it's equivalent in that sense. Once you pick a gold coin, you've indeed eliminated that box. But the other two boxes are not equally likely to have been picked.
I agree with you both. In the Monty Hall problem your host knows which door he has to open in order not to show you the good one. But here you don't have that information, thus you don't know in which of the other boxes the two silver coins are either. So you have to decide with the only information that you get from picking the very first coin. (If it were silver, and you could change boxes for your next pick, you had better change to either of the other boxes, because that way you would have a higher probability of taking a gold coin in your next move.)
You draw the 1st gold coin from the box containing two gold coins: a gold coin remains in the box. You draw the 2nd gold coin from the box containing two gold coins: a gold coin remains in the box. (But shouldn't the second box be silver if the first box is full of gold? Explain.) You draw the gold coin from the box containing one gold coin and one silver coin: a silver coin remains.
So the probability is 1/2?
I agree with the 2/3 solution but I disagree with your shortcut method. The coins must be thought of in different boxes. This is because, if I had infinitely many silver coins + 1 gold coin in a box vs 2 gold coins in the other box, it is clear the answer to the question would change to a probability of 1. However, with your "shortcut" it would be a probability of 0.
I think somewhere there is a misunderstanding...! "Question: What is the probability that the other coin in the same box is also a gold coin?" In other words, the question is: what are the chances for the next coin to be gold? Basically you have two options... the remaining coin will be either gold or silver; it can't be anything else. That means it is a 50-50 chance, 1/2. I don't know where you find the third one! 🤔
There are, indeed, two possibilities. But those possibilities are not equally likely!
Those possibilities are equally likely, but not for such a simplistic reason. One box contains 2 gold coins, one contains 2 silver, and one contains one gold and one silver. The possibility of pulling out a gold coin only exists in 2 boxes, so our denominator is 2. In one box, the other coin must be silver, and in the other, it must be gold. The probability equals exactly 50%, i.e. 1/2.
@Jerry Braun – The same information you used to rule out one box (the one with two silvers) also tells you that there's a 2/3 probability that we selected the box with two gold coins at the outset.
Please update the description of the question.
If one gold coin is removed, the second coin is simply either gold or silver.
Probability is 1/2.
yep, agree
Totally agree.
Also, the three boxes are not identical, whether they are all closed or open.
The coins in each box are radically different.
The two coins are the same in the gold-coin box. Then why are we taking different possibilities into account to calculate the probability?
Once you have taken a gold coin, the remaining coin can only be gold or silver, so the probability that the second coin is gold is 1/2.
In the same way, since it always either rains or doesn't rain, there should always be a 50% chance of rain.
I also think the chance the second coin is gold is 50%
$$P(\text{two gold coins} \mid \text{first coin gold}) = \frac{P(\text{first coin gold} \mid \text{two gold coins})\,P(\text{two gold coins})}{P(\text{first coin gold})} = \frac{1 \times \frac{1}{3}}{\frac{1}{2}} = \frac{2}{3}$$ by Bayes' theorem.
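For readers who want to check the arithmetic, here is a minimal Python sketch of the same Bayes calculation (the box contents are hard-coded from the problem statement; the names are illustrative, not taken from any posted code):

```python
# Exact Bayes check for Bertrand's box problem (a sketch, not the original solution).
from fractions import Fraction

boxes = {"GG": ["G", "G"], "SS": ["S", "S"], "GS": ["G", "S"]}
p_box = Fraction(1, 3)  # each box is equally likely to be chosen

# P(first coin gold | box) for every box
p_gold_given_box = {name: Fraction(coins.count("G"), 2) for name, coins in boxes.items()}

p_first_gold = sum(p_box * p for p in p_gold_given_box.values())   # total probability = 1/2
p_gg_given_gold = p_gold_given_box["GG"] * p_box / p_first_gold    # Bayes' theorem

print(p_first_gold)     # 1/2
print(p_gg_given_gold)  # 2/3
```

Since the second coin is gold exactly when the chosen box is GG, the printed 2/3 is also the answer to the original question.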
I love it, but I still have no clue why :p. (I thought the answer was 1/2, after all... after the first coin is gold you only know that you have either the GG box, or the GS box... seemed to me that the chance for either is 50% after that, but I guess that is my mistake :/). Hmm, I guess that the chance of having the GG box is twice as large (since it has 2 gold coins, out of the 3 gold coins in the GG + GS boxes)... So yeah, the chance then would be 2/3.
Your reasoning in the parenthetical is good! There are three gold coins in play -- two of them are in the GG box and one is in the GS box.
I don't understand how the 2/3 answer is possibly correct. You have pulled a coin from a box and it is gold. You know you have one of two boxes -- the GG box (which has a single gold coin left in it) or the GS box (which has a single silver coin left in it). You have a 0.5 probability that the second coin in that same box will be gold.
@Dan G – Ah, even though you know you have one of two boxes, it's not equally likely that you have each box. Knowing you pulled a gold coin, it's more likely you got the GG box. Out of the three gold coins, two of them are in the GG box.
Look at it this way: what is the probability of the G coin being from the GG box? In the experiment each coin has the same chance to be drawn. Since the GG box has 2 G coins and the GS box 1 G coin, you get 2 of 3.
It seems to me that the way the question is posed rules out the box with two silver coins as well as one other box. Only the box from which one gold coin has been taken remains in play. It has one coin in it, either gold or silver. In order for the probability to include six coins, 3 gold and 3 silver, the proposition would need to be stated quite differently.
I disagree, for I believe the question is what are the chances that the second coin is gold, and for that matter it does not consider the first coin, for it has already been determined as being gold. And, of course, there are only two possibilities, for the box chosen was either box 01 or box 03, as box 02 has no gold coins in it, and thus it is now out of the equation. So, if it is box 01 (2 x gold) the last coin is gold, and if it is box 03 the last coin is silver. Quite simple, unless there is some crossing in the question.
You're right that there are two possibilities: box 1 and box 3. But the two possibilities are not equally likely! Think of it this way: you pick a box at random and pull out a gold coin. What is the probability it came from box 1? What about box 3?
We should not take into consideration the 1st event (drawing one gold coin); the odds of that are irrelevant. We are trying to find the probability exclusively for the second event. We are trying to see what the chances of drawing a second gold coin are AFTER ALREADY DRAWING A FIRST GOLD COIN. That means there are 2 paths: silver or gold. What happened before is irrelevant. Shouldn't it be like an unbiased event, where the probability of a previously drawn coin does not affect the probability of this event?
@Sarthak Shrivastava – What happened before is relevant because it tells us some information about what box we chose. The fact that we picked a gold coin initially means it's more probable that we selected box 1 than box 3 at the beginning of the problem.
After one coin has been removed there is only one coin left and it is either gold or silver, so there is a 50/50 chance that it is either one, for there is a 50/50 chance that the box is the 01 box and, again, a 50/50 chance that the box is the 03 box, for it can NOT be the 02 box, for the 02 box has no gold coin in it. After you took one coin it is a new event, the event of the second coin, absolutely apart from the first event (although you can use its result to deduce the result and possibilities for the next event, as you just proved it can not be the 02 box); in the second event there is only ONE coin in the box and it can be either silver or gold, thus, again 50/50. The 2/3 concept is a misleading concept that ignores the first result and ignores the fact that there is only one coin left. Thank you.
@Ricardo Martinelli – Just because the remaining coin is either gold or silver doesn't mean the two possibilities are equally likely. While you're correct that it cannot be box 2, it's not a 50/50 probability that it's box 1 or box 3.
You know that on the first result you pulled a gold coin. If we stop the problem there, and ask you to pick which box you drew from, there's a 2/3 probability you drew from box 1 and a 1/3 probability you drew from box 3. From there, the second coin is determined.
Here's another example of the same principle: there are two boxes. One contains 99 gold coins and one silver coin. The other contains 99 silver coins and one gold coin. You pull a gold coin from a random box. If you pull another coin from the same box, is it more likely to be gold or silver?
@Jordan Cahn – But you are talking about the event that is over, the first coin retrieved. The question relates only to the second and last coin and the two possible boxes (reduced to two by the fact that it is a gold coin), and as one picks from the same box, it now has only one coin left, which can only be gold or silver, thus 50/50.
The next example is totally different: the chance of it being box 2 is one in a hundred. Either way, if the second coin is silver, you'll need to pick a third coin to be sure about the box number.
T.Y.
@Ricardo Martinelli – How about this? Box 1 has 100 gold, Box 2 has 100 silver, and Box 3 has 99 silver and 1 gold. You draw a gold coin at random. What box did you most probably draw out of?
It doesn't matter that the event already happened, you gathered some information that you didn't start the problem with. Just like you can use that information to rule out Box 2, you can use the information to determine that it's more probable you're in Box 1.
Shouldn't we look at only the probability of the second gold coin because of the way the question is framed? One of them has already been drawn so it is either the 1st or 3rd box. So probability is 1/2
I used this example in another comment:
Let's say there are two boxes. One contains 99 gold coins and one silver coin. The other contains 99 silver coins and one gold coin. You pull a gold coin from a random box. If you pull another coin from the same box, is it more likely to be gold or silver?
Interesting. That would be in the true spirit of what Bertrand postulated. Can you give the link to your comment here?
@Sunil Nandella – I'm not sure how to link to a specific comment. It was in response to Ricardo Martinelli above.
@Jordan Cahn – OK, it is this question you posed: "How about this? Box 1 has 100 gold, Box 2 has 100 silver, and Box 3 has 99 silver and 1 gold. You draw a gold coin at random. What box did you most probably draw out of?" I started to get the idea of what you are saying. Can you support me with the exact probability (in numbers) of drawing the second gold coin in this case?
@Sunil Nandella – If you draw a gold coin in that situation, there is a 100/101 chance that your gold coin came from Box 1. You can either see this as resulting from all gold coins being equally likely to be drawn, or by using Bayes's formula as I did in my solution to the original problem:
$$P(\text{100 gold coins} \mid \text{first coin gold}) = \frac{P(\text{first coin gold} \mid \text{100 gold coins})\,P(\text{100 gold coins})}{P(\text{first coin gold})} = \frac{1 \times \frac{1}{3}}{\frac{101}{300}} = \frac{100}{101}$$
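The same exact-fraction check works for this variant; a minimal Python sketch (the box labels are mine, and the contents are taken from the variant stated above):

```python
# Exact check of the 100-gold / 100-silver / 99-silver-1-gold variant (a sketch).
from fractions import Fraction

p_gold_given_box = {
    "Box 1": Fraction(100, 100),  # 100 gold
    "Box 2": Fraction(0, 100),    # 100 silver
    "Box 3": Fraction(1, 100),    # 99 silver, 1 gold
}
p_box = Fraction(1, 3)

p_first_gold = sum(p_box * p for p in p_gold_given_box.values())        # 101/300
p_box1_given_gold = p_gold_given_box["Box 1"] * p_box / p_first_gold    # Bayes

print(p_first_gold, p_box1_given_gold)  # 101/300 100/101
```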
@Jordan Cahn – Excellent. That means the answer to the below is 99/100. "One contains 99 gold coins and one silver coin. The other contains 99 silver coins and one gold coin. You pull a gold coin from a random box. If you pull another coin from the same box, is it more likely to be gold or silver?" = (99/100 × 1/2) / (1/2) = 99/100.
@Sunil Nandella – Which is the same for silver; both are the same.
@Sunil Nandella – Yes in that problem, pulling a gold means there's a .99 chance that the next coin is gold. Pulling a silver means there's a .99 chance that the next coin is silver.
Thanks, an excellent use of Bayesian probability in a simple case, very handy to refer to when I get confused with more complicated Bayesian problems.
Did anybody else hear their inner voice yelling "Monty Hall problem!" and still ended up convincing themselves it was 1/2?
Me :/ I tried to do Bayes, failed, and went with the simplest answer.
Nice approach! For those that do not agree with the solution, there is code implemented in Python to simulate this problem at the following link: https://colab.research.google.com/drive/1OeB8nhS5V8cE__kXVdEz3SZeFu-Y21vj
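The linked notebook isn't reproduced here, but a simulation along those lines might look like the following minimal sketch (box contents are hard-coded from the problem statement; the function and variable names are my own):

```python
# Monte Carlo simulation of Bertrand's box problem (a sketch, not the linked notebook).
import random

def draw_pair():
    """Pick a random box, shuffle it, and return (first coin drawn, other coin)."""
    box = random.choice([["G", "G"], ["S", "S"], ["G", "S"]])
    random.shuffle(box)
    return box[0], box[1]

trials = [draw_pair() for _ in range(1_000_000)]
others = [other for first, other in trials if first == "G"]  # condition on a gold first draw
print(sum(coin == "G" for coin in others) / len(others))     # ~0.667, i.e. about 2/3
```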
I've been getting the answer 1/2 the whole time too. But now eventually I realised why 2/3 is right.
Maybe my explanation helps you also to understand.
We have 3 boxes, each with two coins:
Box 1: GA / GB
Box 2: SC / SD
Box 3: GE / SF
Each of the six coins is equally likely to be chosen, so each has probability 1/6. So if we pick GA the next pick would be GB, and so on:
GA -> GB
GB -> GA
SC -> SD
SD -> SC
GE -> SF
SF -> GE
We have picked one golden coin. So it is either GA or GB or GE; we don't know which one. Now if it's GA we will pick GB next, if we picked GB we will pick GA, and if we picked GE we will pick SF. So we have 2 out of 3 chances, which means the probability is 2/3.
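The same counting argument can be written as a few lines of Python; here is a minimal sketch using the coin labels from the comment above (the helper names are my own):

```python
# Enumerate the six equally likely first draws and count how often the other
# coin in the same box is also gold (a sketch of the argument above).
boxes = [("GA", "GB"), ("SC", "SD"), ("GE", "SF")]

# Map each possible first draw to the coin left behind in the same box.
other_coin = {}
for a, b in boxes:
    other_coin[a] = b
    other_coin[b] = a

gold_first = [coin for coin in other_coin if coin.startswith("G")]                # GA, GB, GE
gold_second = [coin for coin in gold_first if other_coin[coin].startswith("G")]   # GA, GB

print(len(gold_second), "out of", len(gold_first))  # 2 out of 3
```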
There are six coins in total, all equally likely to be selected. Since you know a gold coin was taken, only three of the six possibilities remain. In two of these, the gold coins come from the same box. So, the probability that the other coin is also gold equals 2/3.
Yes, this is like drawing a probability "tree", every branch equally likely, 1/6. Then eliminate all branches ending in a silver coin; two branches come from the GG box, one from the GS box.
You are twice as likely to pull a gold coin out of the first box than the third.
I like the simple explanation!
This is an incorrect explanation. If you pull one gold coin, you have a 1/2 chance of pulling a gold coin from the same box. Because you only have 2 boxes that contain a gold coin. So that is a 1/2 chance of pulling a gold coin out of the same box.
You assume that each box is equally likely, but in reality, the box with two gold coins is more likely to be your box given that the coin you selected is gold. Use Bayes' Theorem for a more formal proof, as Jordan Cahn did.
Isn't that Monty Hall?
> You are twice as likely to pull a gold coin out of the first box than the third.
There is no third box, there are only two boxes, because by pulling a gold coin out you reduce the problem to the next coin having an equal probability of being gold or silver.
Why even do these problems when there is not an undeniable proof of the answer that people can learn from ... this problem just confuses people, at least me. I think the problem poser failed to word this problem correctly to bring in the probability of first having to get to the 50:50 point.
The probability of picking a box with one gold coin is what is in question here, but given that a gold coin has been picked, the probability then of picking another gold coin is 1:2.
We had three boxes with coins. Given that we have already found one gold coin, the problem reduces to finding the probability that a coin drawn would be gold when we have 1 silver and 2 gold coins, because of that extra information.
Two thirds of the gold coins are in the box with two gold coins.
Once one gold coin is pulled out, there are two boxes in question. Since the odds of a gold coin being pulled out of a box with two gold coins are greater than for a silver-and-gold box, the odds are over 50%. The only option over 50% is 2/3, so I picked that.
There are 2 types of coin, of which 3 are silver and 3 are gold.
There are 3 types of box: All Gold (1g), All Silver (1s), Half Gold / Half Silver ((1/2)g + (1/2)s)
(1g) + (1s) + ((1/2)g + (1/2)s) = 3 Boxes
We can remove Box (1s), as it can't be both silvers: we have picked a gold. We can see our remaining options are two boxes: all gold, and half gold & half silver.
This can be written in half boxes to illustrate that we are yet to determine which ((1/2)g) we picked from which chest.
(2 (1/2)g) + ((1/2)g + (1/2)s) = 2 Boxes left
Now we remove the gold coin in our hand ((1/2)g) and are left with:
(1/2)g + (1/2)g + (1/2)s = 100% of what's left
All half quantities are equally likely (33.3% each); as a ratio (2:1), two of them are gold to one silver.
Therefore there is a 66.6% probability the next coin is gold.
If anyone has trouble grokking this puzzle, here's another way to think of it.
Instead of the coins being in boxes, suppose that they are attached in pairs with long strings. You take all six coins, drop them in a hat, and stir so that the strings are too mixed up for you to easily see which ones connect. You grab one of the three gold coins. When you follow its string to the attached coin, what are the chances you expect to find another gold coin on the other end? Of the three gold coins, two of them connect to another gold coin.
The solution, as many posters have demonstrated, is 2/3. Thirty years on from the Monty Hall Problem we still have mathematicians denying basic probability. It's depressing.
I'm fairly new to this, so here's how I thought it through... there are TWO boxes with gold coins in them. I've effectively chosen one of them. Now, after that fact (regardless of what the odds were before) I have chosen ONE of the TWO boxes. The odds that I chose the box with two gold coins would be 1/2, wouldn't they? And if that were the case, the chance that the other coin is gold must therefore be the same, no?
I've managed to convince myself otherwise now... because there were two gold coins in one, I'm TWICE as likely to choose that one in the first place. See? Told you I was new to this :)
For the answer that you're "supposed" to get, imagine playing this game 300 times. 150 of those times you end up with a gold coin. 100 of those situations occur in the box with two gold coins. Of course, once you have chosen a box there is no probability involved. You either have a gold coin or you don't. So the correct answer to the question as given is 1 or 0. In this situation this is pedantry, but poor phrasing leads to a lot of confusion in this subject and is a good way of hiding paradoxes for other questions.
I believe that the answer of 2/3 is wrong. After the box is chosen and there is at least 1 gold coin, then there are only 2 possible outcomes: either it is the box with 2 gold coins or the box with 1 gold coin. Since there are only 2 possible outcomes, and only 1 of them is correct, the correct answer is 1/2.
That was the conclusion I came to as well. I don't understand the rationale of considering all 6 coins when only 4 of them pertain to the question being posed.
Mark the golden coins in the box with 2 golden coins as nr 1 and nr 2, and the golden coin in the box with the silver coin as nr 3. Now you choose a box and pull a golden coin out. The coin pulled out is nr 1, 2, or 3 with equal probability. Only with nr 3 is the other coin not gold.
@Erwin Sg – You have already picked one of the two boxes that contain a gold coin: the box with two gold coins, or the box with a gold and a silver coin; you already have one of those two boxes. You seem to be somehow factoring the original pick of "a box with at least one gold coin" into the problem when you consider it that way.
@Bp Kline – You can only pick a final gold coin out of the GG box, but there are two ways this could have happened. The third gold coin in the GS box gives you the one way you can't finish with a gold.
I agree with you two... it is a given that when you have pulled a gold coin out already... the next question is whether the next coin is gold or silver... and equal probability... 50%. What other possibility is there?
Once the box is chosen there is only ONE possible outcome! :-)
I completely agree. There are only two boxes with a gold coin. There are only two possible outcomes and therefore the conditional probability is 1/2
There are three ways you can get to the position posed by the question, all equally likely:
- you drew the first gold coin from the box with two gold coins,
- you drew the second gold coin from the box with two gold coins, or
- you drew the single gold coin from the box with one gold and one silver coin.
The second coin is drawn from the same box . . . two scenarios lead to it being gold, one scenario leads to it being silver.
Thus the overall probability that the other coin in the same box is also gold is 2/3.
When I solved the problem in my head I took a shortcut. I visualized a single "black box" of all the information given: one gold coin drawn from four coins, for I knew that the silver-only box must be discounted. This leaves two gold coins and one silver coin in my "black box". Thus, the chance that the next coin is gold is 2/3.