There are 2 buckets and 100 balls. A man throws the balls one at a time, and each ball always lands in one of the two buckets. At which point (how many balls thrown) is the difference between the number of balls in the two buckets statistically likely to be the greatest?
When you throw 1 ball the difference is always 1, so the expected difference is 1. Working through the small cases in the same way gives: 1 ball: 1; 2 balls: 1; 3 balls: 1.5; 4 balls: 1.5; 5 balls: 1.875; 6 balls: 1.875; and so on. Each value comes from listing the possible splits and weighting each difference by its probability. For 5 balls the splits are 5-0, 4-1, and 3-2; a 3-2 split, for example, has probability 20/32 and difference 1, so it contributes (20/32)·1 to the expectation.

The expected difference grows, but only on every other step. Going from n balls to n+1, the difference moves up or down by 1 with equal probability, unless the buckets are tied, in which case it must go up by 1. A tie is only possible when n is even, so the expectation increases on every even-to-odd step and stays flat on every odd-to-even step. The expected difference after 99 balls therefore already equals the one after 100, and the maximum is first reached at 99, which is why the answer is 99 and not 100.
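This is easy to check exactly: the count in one bucket is Binomial(n, 1/2), so the expected difference is a single sum. Here is a short Python sketch (the function name and loop bounds are mine):

```python
from math import comb

def expected_diff(n):
    """Exact E[|balls in bucket 1 - balls in bucket 2|] after n fair throws.

    If k balls land in bucket 1, the difference is |2k - n|, and k is
    Binomial(n, 1/2), so each k has probability C(n, k) / 2**n.
    """
    return sum(comb(n, k) * abs(2 * k - n) for k in range(n + 1)) / 2 ** n

for n in range(1, 7):
    print(n, expected_diff(n))   # 1.0, 1.0, 1.5, 1.5, 1.875, 1.875

# First throw count at which the maximum over 1..100 is reached:
values = [expected_diff(n) for n in range(1, 101)]
print(1 + values.index(max(values)))   # 99
```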
If you liked this challenge, you can try the task with 3 buckets and 1000 balls: after how many throws is the sum of two of the buckets likely to be closest to the count in the third?
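The 3-bucket wording leaves open which two buckets to sum; one plausible reading takes, at each throw count n, the best pairing, i.e. the minimum over buckets c of |(a+b) − c|, which simplifies to |n − 2c| since a + b + c = n. A minimal Monte Carlo sketch under that assumption (the function name, trial count, and seed are mine, and the reading itself is a guess):

```python
import random

def mean_gap_per_throw(n_balls=1000, trials=2000, seed=1):
    """Estimate, for each throw count n, E[min over buckets c of |(a+b) - c|].

    With counts (a, b, c) summing to n, |(a+b) - c| = |n - 2c|, so only the
    individual bucket counts need to be tracked.
    """
    rng = random.Random(seed)
    totals = [0.0] * (n_balls + 1)
    for _ in range(trials):
        counts = [0, 0, 0]
        for n in range(1, n_balls + 1):
            counts[rng.randrange(3)] += 1           # ball lands uniformly
            totals[n] += min(abs(n - 2 * c) for c in counts)
    return [t / trials for t in totals]

gaps = mean_gap_per_throw()
# Throw count whose mean gap is smallest, i.e. where the sum of two
# buckets is, on average, closest to the third:
print(min(range(1, len(gaps)), key=lambda n: gaps[n]))
```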