Let $S = 0$.
Pick a random real number $k$ from a uniform distribution on $[0, 1]$, then add $k$ to $S$: if $S < 1$, repeat the process and add another number; if $S \ge 1$, the process ends.
What is the expected value of $S$?
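One quick way to get a feel for the answer is a short Monte Carlo simulation. Here is a minimal Python sketch (the helper name `sample_S` is just for illustration, not part of the problem): it repeatedly runs the process and averages the final sums.

```python
import random

def sample_S():
    """Add uniform(0, 1) numbers until the running total reaches at least 1; return that total."""
    s = 0.0
    while s < 1.0:
        s += random.random()
    return s

trials = 1_000_000
estimate = sum(sample_S() for _ in range(trials)) / trials
print(estimate)  # typically close to e/2 ≈ 1.35914
```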
Could you tell me what $E$ and $P$ are here? These symbols are completely new to me.
I think the second equation is not correct. If $n = 10$, then $E[S \mid N = 10] = 5$. But it does not make sense that $E[S \mid N = 10] > 2$. I guess the problem lies in the fact that $N$ is not independent of the $X_j$. For example, if $X_1 = 0.99$, then it is very likely that $N = 2$.
What we are doing here is a rather delicate separation of the process into two parts. Remember that $S = S_N$.
We could have a set-up where $N$ is equal to the value I obtain when rolling a $6$-sided die. Then it is still true (for example) that $E[S_N \mid N = 20] = E[S_{20}] = 10$, even though $S = S_N$ cannot exceed $6$. The point is that what is actually possible is strictly controlled by the probability distribution of $N$, which handles the different conditional expectations. In the above example, the probability that $N = 20$ is $0$, which means that the counter-intuitive conditional expectation does not contribute.
I agree with your example. But in your example the rolling of the die is independent of the sequence $\{X_j\}_j$, whereas in this problem that does not hold. In fact, I have found that for $n \ge 2$, $E[S \mid N = n] = 1 + \frac{n}{2(1+n)}$. This also leads to the correct result, since
$$E[S] = \sum_{n \ge 2} E[S \mid N = n]\, P(N = n) = \sum_{n \ge 2} \frac{(n-1)(2+3n)}{2(n+1)!} = \frac{e}{2}.$$
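These formulas are easy to check numerically. Below is a rough Python sketch (the helper name `run_once` is just illustrative): it estimates $E[S \mid N = n]$ by simulation for small $n$, compares it with $1 + \frac{n}{2(n+1)}$, and also sums the series above.

```python
import math
import random
from collections import defaultdict

def run_once():
    """Return (N, S): the number of picks needed and the final sum."""
    s, n = 0.0, 0
    while s < 1.0:
        s += random.random()
        n += 1
    return n, s

sums, counts = defaultdict(float), defaultdict(int)
for _ in range(1_000_000):
    n, s = run_once()
    sums[n] += s
    counts[n] += 1

for n in range(2, 7):
    simulated = sums[n] / counts[n]
    claimed = 1 + n / (2 * (n + 1))  # the conditional expectation stated above
    print(n, round(simulated, 4), round(claimed, 4))

series = sum((n - 1) * (3 * n + 2) / (2 * math.factorial(n + 1)) for n in range(2, 40))
print(series, math.e / 2)  # both ≈ 1.3591409
```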
@Plinio Sd – You are right. It means that things are more complex algebraically. I have modified my proof to suit.
Suppose the process stops with the $(n+1)$th pick.
The condition for this is that we pick $x_1, x_2, \ldots, x_n$ and $x_{n+1}$ such that $x_1 + x_2 + \cdots + x_n < 1$ and $x_1 + x_2 + \cdots + x_n + x_{n+1} > 1$.
Let $A$ denote the event that $x_1 + x_2 + \cdots + x_n < 1$, and $B$ denote the event that $x_1 + x_2 + \cdots + x_n + x_{n+1} > 1$.
We need $P(A \cap B)$ in order to calculate the expectation.
$P(A \cap B) = P(A) \cdot P(B \mid A)$. [Here $P(B \mid A)$ denotes the probability of event $B$ given that event $A$ has happened.]
So let us calculate $P(A)$.
The probability of this event is given by $\dfrac{\text{volume of the standard } n\text{-simplex}}{\text{volume of the unit hypercube in } n \text{ dimensions}}$.
The volume of the standard n-simplex can be found using calculus.
$$\int_0^1 \int_0^{1-x_n} \int_0^{1-x_{n-1}-x_n} \cdots \int_0^{1 - x_2 - x_3 - \cdots - x_{n-1} - x_n} dx_1\, dx_2\, dx_3 \cdots dx_{n-1}\, dx_n = \frac{1^n}{n!} = \frac{1}{n!}.$$
Hence $P(A) = \dfrac{1}{n!}$.
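This value of $P(A)$ is easy to sanity-check by simulation. Here is a small Python sketch (the helper name `prob_sum_below_one` is just for illustration) comparing the empirical frequency with $1/n!$.

```python
import math
import random

def prob_sum_below_one(n, trials=500_000):
    """Estimate P(x_1 + ... + x_n < 1) for i.i.d. uniform(0, 1) picks."""
    hits = sum(1 for _ in range(trials) if sum(random.random() for _ in range(n)) < 1.0)
    return hits / trials

for n in range(1, 6):
    print(n, round(prob_sum_below_one(n), 5), round(1 / math.factorial(n), 5))
```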
Now to find $P(B \mid A)$, let us write $z = x_1 + x_2 + \cdots + x_n$.
So $P(B \mid A)$ is the probability that $z + x_{n+1} > 1$, where $z$ and $x_{n+1}$ are uniformly and randomly selected from $(0, 1)$, which is an easy two-dimensional problem. We can even apply basic geometry to solve it: it is the area of the triangle cut off from the unit square by the line $x + y = 1$. But let us do it with the simplex idea above, the region in question being a $2$-dimensional simplex (a triangle).
$$P(B \mid A) = \int_0^1 \int_{1-x_{n+1}}^1 dz\, dx_{n+1} = \frac{1}{2}.$$
Hence $P(A \cap B) = \dfrac{1}{2 \cdot n!}$.
Hence the expectation $E(S)$ is given by
$$E(S) = \sum_{n=1}^{\infty} n \cdot P(A \cap B) = \sum_{n=1}^{\infty} \frac{n}{2 \cdot n!} = \frac{1}{2} \sum_{n=1}^{\infty} \frac{1}{(n-1)!} = \frac{e}{2}.$$
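As a quick numerical check (a short Python sketch, not part of the solution itself), the partial sums of $\sum_{n \ge 1} \frac{n}{2\, n!}$ converge rapidly to $e/2$.

```python
import math

# Partial sums of n / (2 * n!) converge rapidly to e/2.
total = 0.0
for n in range(1, 20):
    total += n / (2 * math.factorial(n))
print(total, math.e / 2)  # both ≈ 1.3591409
```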
Useful wiki: Simplex.
I am grateful to Plinio Sd for pointing out a gap in the previous version of this proof.
Let $X_n$ be the value of the $n$th random variable, and let $S_n = \sum_{j=1}^n X_j$ be the sum of the first $n$ of them. Let $N$ be the number of RVs that are needed to make the sum greater than $1$, so that the random variable we want is $S = S_N$. So long as $0 < x \le 1$,
$$P[S_n < x] = \int_0^x P[S_{n-1} < x - y]\,dy,$$
so a simple induction gives
$$P[S_n < x] = \frac{x^n}{n!}, \qquad 0 < x \le 1.$$
Thus
$$P[N = n] = P[S_{n-1} < 1 \le S_n] = \frac{1}{(n-1)!} - \frac{1}{n!} = \frac{n-1}{n!}, \qquad n \ge 2.$$
Although we do not have the full probability density function $f_n$ of $S_n$, it is clear that
$$f_n(x) = \frac{x^{n-1}}{(n-1)!}, \qquad 0 \le x \le 1.$$
Moreover
$$\begin{aligned} P[S \le 1 + x \ \text{and}\ N = n] &= \int_0^1 f_{n-1}(y)\, P[1 - y \le X_n \le 1 + x - y]\,dy \\ &= \int_0^x f_{n-1}(y)\, P[1 - y \le X_n \le 1]\,dy + \int_x^1 f_{n-1}(y)\, P[1 - y \le X_n \le 1 + x - y]\,dy \\ &= \int_0^x f_{n-1}(y)\, y\,dy + x \int_x^1 f_{n-1}(y)\,dy = \frac{x}{(n-1)!} - \frac{x^n}{n!}, \end{aligned}$$
and hence
$$\begin{aligned} E[S \mid N = n]\, P[N = n] &= \int_0^1 (1 + x)\, \frac{d}{dx}\!\left[\frac{x}{(n-1)!} - \frac{x^n}{n!}\right] dx \\ &= \int_0^1 (1 + x)\left[\frac{1}{(n-1)!} - \frac{x^{n-1}}{(n-1)!}\right] dx = \frac{3n^2 - n - 2}{2(n+1)!} \end{aligned}$$
for any $n \ge 2$, and hence
$$E[S] = E\big[E[S \mid N]\big] = \sum_{n \ge 2} E[S \mid N = n]\, P[N = n] = \sum_{n \ge 2} \frac{3n^2 - n - 2}{2(n+1)!} = \frac{1}{2} \sum_{n \ge 2}\left[\frac{3}{(n-1)!} - \frac{4}{n!} + \frac{2}{(n+1)!}\right] = \frac{1}{2}e,$$
making the answer $1.359140\ldots$.
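A short numerical check of the pieces of this argument (a Python sketch, separate from the proof): the probabilities $P[N = n] = \frac{n-1}{n!}$ sum to $1$, the contributions $\frac{3n^2 - n - 2}{2(n+1)!}$ agree term by term with the partial-fraction split used in the last line, and their total is $e/2$.

```python
import math

terms = range(2, 40)

# P[N = n] = (n - 1)/n! should sum to 1 over n >= 2.
print(sum((n - 1) / math.factorial(n) for n in terms))  # ≈ 1.0

# E[S | N = n] P[N = n] = (3n^2 - n - 2) / (2 (n+1)!) versus its partial-fraction form.
contrib = [(3 * n * n - n - 2) / (2 * math.factorial(n + 1)) for n in terms]
split = [0.5 * (3 / math.factorial(n - 1) - 4 / math.factorial(n) + 2 / math.factorial(n + 1))
         for n in terms]
print(max(abs(a - b) for a, b in zip(contrib, split)))  # ≈ 0

print(sum(contrib), math.e / 2)  # both ≈ 1.3591409
```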