S = ∑_{n=1}^{∞} f(n)/n
Let S denote the value of the random sum above, where for each individual n, f(n) has a 50% chance of being 1 and a 50% chance of being -1 (with the f(n)'s being independent of one another).
What is the probability that the series S converges?
Sir, how should we start studying infinite series with random variables? (I think it's really cool to replace the nth terms with random variables!) I went through the wiki article but couldn't get anything out of it, and there's a lot of material online that I can't follow. Can you guide me through the basic materials?
When I first wondered about this concept (some time ago), I stumbled across this paper, which I found to be an excellent introduction. I suppose I'm being lazy, but there is no way I could match the clarity of the information provided in this paper.
Yay intuition.
This problem makes me wonder... For what value of the probability (0.5 in this case) does the series diverge?
Yeah, that's a great question! My reading of the Kolmogorov theorem is that if there is any deviation from the 0.5 probability, then the series almost surely diverges (albeit incredibly slowly for values close to 0.5). That would be a surprising result, so it might be worth testing it out with a program.... someday. :)
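A quick numerical sketch of that test (all names and parameter choices here are illustrative, not from any linked source): with, say, a 0.6 chance of +1, each term has mean 0.2/n, so the partial sums should drift upward like 0.2·ln(n).

```python
import random

def biased_partial_sum(p_plus, terms):
    # Partial sum of f(1)/1 + ... + f(terms)/terms, where f(n) = +1
    # with probability p_plus and -1 otherwise.
    total = 0.0
    for n in range(1, terms + 1):
        sign = 1.0 if random.random() < p_plus else -1.0
        total += sign / n
    return total

# With p_plus = 0.6 the expected partial sum is 0.2 * (1 + 1/2 + ... + 1/n),
# which grows like 0.2 * ln(n): a slow but unbounded upward drift.
for terms in (10**3, 10**4, 10**5):
    print(terms, biased_partial_sum(0.6, terms))
```

Runs with p_plus = 0.5 stay bounded instead, matching the almost-sure convergence in the unbiased case.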
Interesting... What about if you say there is a 2/3 chance it is +1 and a 1/3 chance it is -2?
That would appear to have a shot at converging almost surely, but I'm not certain it would satisfy condition (iii) of the Kolmogorov theorem. Need to refresh on variances ...
The third condition just becomes (for A = 3) ∑_{n=1}^{∞} Var(X_n) = ∑_{n=1}^{∞} [ (2/3)(1/n²) + (1/3)(4/n²) ] = ∑_{n=1}^{∞} 2/n², which converges, so that example converges almost surely.
I should also point out that the three-series theorem is unnecessary here, since the first condition becomes trivial (the random variables in the sum converge uniformly to 0). It perhaps would be better to quote Kolmogorov's two-series theorem instead.
This is a really interesting problem, in particular the probability density function f of the final sum. It so happens that f(0) and f(2) are remarkably close to (but not exactly) 1/4 and 1/8 respectively. The [approach](http://www.stat.ualberta.ca/people/schmu/preprints/rhs.pdf) found for deriving f is also really ingenious, since S is a continuous random variable built as a sum of discrete random variables, which should have made f hell to find.
It might be interesting to you that f is smooth too.
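A rough Monte Carlo check of those values is easy to sketch (everything below, sample counts included, is an illustrative choice, not from the linked paper): estimate the density near 0 by the fraction of simulated sums landing in a small window around it.

```python
import random

def sample_s(terms=1000):
    # One draw of the (truncated) random harmonic series; truncating at
    # 1000 terms leaves only a tiny tail, since the remaining variance
    # is about 1/1000.
    total = 0.0
    for n in range(1, terms + 1):
        total += (1.0 if random.random() < 0.5 else -1.0) / n
    return total

samples = [sample_s() for _ in range(2000)]
h = 0.25  # half-width of the window around 0
density_at_0 = sum(1 for s in samples if abs(s) < h) / (2 * h * len(samples))
print(density_at_0)  # typically near 1/4
```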
If the distribution of the sign is biased, then again by Kolmogorov's three-series theorem, the series diverges with probability 1. That makes the random harmonic series a special case.
If you want to watch how it converges (generally to some number between -3 and 3 but can be more or less with low probability), this is a fun little program:
Python 2.7:
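A minimal sketch of such a program (function and parameter names here are illustrative choices, not the original poster's): print the running partial sum at intervals and watch it settle.

```python
import random

def watch_convergence(terms=100000, report_every=10000):
    # Accumulate sum of f(n)/n with f(n) = +/-1 chosen uniformly,
    # printing the partial sum every report_every terms.
    total = 0.0
    for n in range(1, terms + 1):
        total += random.choice((1.0, -1.0)) / n
        if n % report_every == 0:
            print(n, total)
    return total

final = watch_convergence()
```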
Great! Thanks for posting this. Your result is consistent with one of the papers I linked to, in that the vast majority of the time the sum converges to a value between -3 and 3, with an average absolute value of between 1 and 1.25.
Just curious, do you know of any random sums like this that only converge some of the time, instead of P ∈ { 1 , 0 } ? Or does the Law of Large Numbers say that this can't happen?
No, it can't happen. Since the convergence of a random series is a tail event (i.e., independent of every finite subset of the random variables), Kolmogorov's zero-one law says that the probability of convergence is either zero or one.
@Abhishek Sinha – Thanks for the concise answer. :)
@Abhishek Sinha – This is fascinating! Thank you so much for the link!
wow .... thanks :)
Does random.choice() return a truly random or a pseudo-random number?
Pseudo-random, but the distribution is good enough to be useful. Most of the time you don't want truly random numbers anyway, because they take too much work to produce.
Even the random numbers from random.org aren't really truly "random"; their random numbers come from atmospheric noise.
How do you generate numbers (either 0 or 1) with a custom probability in Python? For example, I want 30% 0's and 70% 1's (for my Minesweeper game; 0 denotes a mine).
@Pranjal Jain – You can use random.random to generate a random number between 0 and 1. Here's an example:
Python 2.7:
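A sketch along those lines (the helper name is made up here; the 30/70 split is from the question above):

```python
import random

def mine_or_not():
    # Returns 0 (a mine) with probability 0.3, otherwise 1.
    return 0 if random.random() < 0.3 else 1

# Sanity check: the fraction of mines should come out near 0.3.
cells = [mine_or_not() for _ in range(10000)]
print(cells.count(0) / float(len(cells)))
```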
@Brock Brown – Thanks! :)
@Pranjal Jain – No problem!
If you've ever got any more Python questions, let me know!
Let's write X_n = f(n)/n. Then {X_n} is a sequence of independent random variables, each with mean 0. Now, there is a result (which is essentially a version of Kolmogorov's three-series theorem) that states that ∑_{n≥1} X_n converges with probability 1 if ∑_{n≥1} Var(X_n) < ∞. Here Var(X_n) = 1/n², so ∑_{n≥1} Var(X_n) = π²/6 < ∞. Hence the given series converges with probability 1.
This is known as a random harmonic series, which as a consequence of Kolmogorov's three-series theorem converges with probability 1.