Expected Geometric Mean (Extended Version)

Calculus Level 4

Let \(G_n\) be the expected value of the geometric mean of \(n\) real numbers independently and uniformly chosen at random between 0 and 1.

What is \(\displaystyle \lim_{n \rightarrow \infty} G_n\)?


The answer is 0.367879.


7 solutions

Stephen Brown
Oct 29, 2017

For \(n\) randomly chosen numbers \(a_1, a_2, \ldots, a_n\), the expected value of the geometric mean is found by integrating the geometric-mean function over the unit hypercube \([0,1]^n\):

\[\int_{0}^{1}\int_{0}^{1}\cdots\int_{0}^{1}\sqrt[n]{a_1 a_2 \cdots a_n}\, da_1\, da_2 \cdots da_n\]

Since the integrand factors as \(a_1^{1/n} a_2^{1/n} \cdots a_n^{1/n}\), each integral contributes a factor \(\int_0^1 a^{1/n}\,da = \frac{n}{n+1}\), giving a total integral of \(\left(\frac{n}{n+1}\right)^{n}\).

Now we can evaluate the limit as \(n\) approaches infinity; manipulating the expression, we get:

\[\lim_{n \to \infty}\left(\frac{n+1}{n}\right)^{-n} = \lim_{n \to \infty}\left(\left(1+\frac{1}{n}\right)^{n}\right)^{-1} = e^{-1} \approx 0.367879\]
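As a quick numerical check (my addition, not part of the original solution), the closed form \((n/(n+1))^n\) can be tabulated for growing \(n\); it should approach \(1/e\):

```python
import math

# Expected geometric mean of n uniform(0,1) variables, per the
# closed form derived above: (n/(n+1))^n.
def expected_gm(n):
    return (n / (n + 1)) ** n

for n in (1, 10, 1000, 10**6):
    print(n, expected_gm(n))

print(math.exp(-1))  # limiting value, approximately 0.367879
```

For \(n = 10^6\) the closed form already agrees with \(e^{-1}\) to about six decimal places.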

Same solution! I first thought of using Excel to approximate the result, but I ended up working out the \(n=2\) case first, and then found that it extends easily to infinity.

Kelvin Hong - 3 years, 6 months ago

For \(n \to \infty\), one of the numbers will be zero. So the product will be null, and the mean also.

Osvaldo Guimarães - 3 years, 6 months ago

The probability of selecting 0 for any given variable is in fact 0. This conclusion makes the erroneous assumption that \(\infty \cdot 0 > 0\).

Stephen Brown - 3 years, 6 months ago

In fact the probability is 0 for any particular number, but if you sample infinitely many times, it will surely be chosen. Try it on a computer.

Osvaldo Guimarães - 3 years, 6 months ago

@Osvaldo Guimarães You'll find that computers are notoriously unreliable when dealing with notions of infinity or infinitesimals. There's a limit to the precision of floating-point numbers that will cause a computer to report an extremely small number as being equal to 0, which is not the case. In this example, let's say we have 100 random numbers from 0 to 1 whose product winds up being \(10^{-1000}\). You'll find that Python returns this number as 0.0, and then returns its 100th root \((10^{-1000})^{1/100}\) as 0.0 as well, rather than the correct \(10^{-10}\). Clearly these are incorrect.
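A small Python sketch of the underflow issue described above (my addition): multiplying 100 factors of \(10^{-10}\) underflows to 0.0, while working in log space recovers the correct geometric mean.

```python
import math

# 100 factors of 1e-10 multiply to 1e-1000, far below the smallest
# positive double (~5e-324), so the naive product underflows to 0.0
# and the naive geometric mean comes out as 0 instead of 1e-10.
factors = [1e-10] * 100
naive_product = math.prod(factors)
naive_gm = naive_product ** (1 / 100)

# Summing logarithms stays in a comfortable range and avoids underflow.
log_gm = math.exp(sum(math.log(x) for x in factors) / len(factors))

print(naive_product, naive_gm)  # 0.0 0.0
print(log_gm)                   # approximately 1e-10
```

This is why the solutions below that pass through \(\log\) are the numerically reliable way to experiment with this problem.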

Consider also this (non-rigorous) argument, which may convince you that you cannot assume the product is 0: the real numbers between 0 and 1 inclusive are uncountably infinite, while the number of variables in our product is countably infinite. As the cardinality of an uncountably infinite set is strictly larger than the cardinality of a countably infinite set, it cannot be said with certainty that any particular real number must be chosen. The only reliable approach to such a problem is the use of continuous probability distributions.

Stephen Brown - 3 years, 6 months ago

@Osvaldo Guimarães How about we just discuss this mathematically instead?

Kelvin Hong - 3 years, 6 months ago

That was my assumption also: that for an infinite set of random values between 0 and 1, a single 0 would make the geometric mean zero as well.

Kurt Schwind - 3 years, 5 months ago

Same solution here! :D

Kevin Tong - 3 years, 6 months ago
Laszlo Kocsis
Nov 16, 2017

I had no idea how to solve this. I gave a shot at 1/e. Sorry folks... :)

What a lucky guy!

Haojun Liang - 3 years, 6 months ago

Same for me :) In fact I guessed it should be linked with \(e\) because of the infinite product and the \(1/n\) exponent... and since the answer was obviously between 0 and 1, I tried \(1/e\)!

Julien Bonal - 3 years, 6 months ago

I wonder what fraction of the people who got this right simply guessed 1/e. I know I'm in the same boat as you.

J R - 3 years, 6 months ago
Ville Karlsson
Nov 16, 2017

Let \(a_k\) be random numbers uniformly distributed between 0 and 1. Then:

\[\lim_{n\to\infty} \sqrt[n]{a_1 a_2 \cdots a_n} = \lim_{n\to\infty} \exp\left(\frac{1}{n}\sum_{k=1}^{n} \log a_k\right)\]

Then from Monte Carlo integration (the sample average of \(\log a_k\) converges to the integral of \(\log x\) over the sampling interval) it follows that:

\[\lim_{n\to\infty}\frac{1}{n}\sum_{k=1}^{n} \log a_k = \int_{0}^{1} \log x \, dx = \Big[x\log x - x\Big]_0^1 = -1\]

And so finally

\[\lim_{n\to\infty}\sqrt[n]{a_1 a_2 \cdots a_n} = \exp(-1)\]
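The Monte Carlo step above can be sketched directly (my addition, standard library only): the sample average of \(\log U\) for uniform \(U\) settles near \(-1\).

```python
import math
import random

random.seed(0)

# Sample average of log(U) for U ~ Uniform(0,1) approximates the
# integral of log(x) over (0,1), which equals -1.
# Using 1 - random.random() maps [0,1) to (0,1], avoiding log(0).
n = 200_000
mean_log = sum(math.log(1.0 - random.random()) for _ in range(n)) / n

print(mean_log)            # close to -1
print(math.exp(mean_log))  # close to 1/e
```

The standard deviation of the estimate is about \(1/\sqrt{n}\), so with \(n = 200{,}000\) the result is typically within a few thousandths of \(-1\).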

James Gilbert
Nov 16, 2017

Using logs and the rules of indices, a product can be written as the exponential of a sum:

\[\prod_{i=1}^{n} x_i = e^{\sum_{i=1}^{n} \log x_i}\]

So by finding the (arithmetic) average of \(\log G\) and exponentiating it, we recover the geometric average of \(G\).

The arithmetic mean of a function over a range is its integral over the range divided by the length of the range (which here happens to be 1):

\[\overline{\log G} = \int_0^1 \log x \, dx = \lim_{a \to 0} \Big[x\left(\log x - 1\right)\Big]_a^1 = \lim_{a \to 0}\left(-1 - a\left(\log a - 1\right)\right) = -1\]

If we exponentiate \(-1\), we get \(\frac{1}{e}\).

Alex Letizia
Nov 17, 2017

If numbers are picked uniformly at random between 0 and 1, we can model them (heuristically) as the grid \(1/n, 2/n, \ldots, n/n\). The geometric mean of this sequence is \((n!/n^n)^{1/n} = (n!)^{1/n}/n\). By Stirling, \(n! \sim \sqrt{2\pi n}\,(n/e)^n\), so taking the limit as \(n \to +\infty\) we get \((n!)^{1/n}/n \sim (2\pi n)^{1/(2n)}/e \to 1/e\).
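The grid heuristic can be checked numerically (my addition): computing \((n!)^{1/n}/n\) via the log-gamma function avoids overflowing \(n!\) for large \(n\).

```python
import math

# Geometric mean of the grid 1/n, 2/n, ..., n/n, i.e. (n!/n^n)^(1/n).
# Computed in log space with lgamma since n! overflows a float quickly:
# log of the quantity is lgamma(n+1)/n - log(n).
def gm_of_grid(n):
    return math.exp(math.lgamma(n + 1) / n - math.log(n))

for n in (10, 1000, 10**6):
    print(n, gm_of_grid(n))

print(math.exp(-1))  # limiting value
```

The Stirling correction factor \((2\pi n)^{1/(2n)}\) dies off quickly, so the values converge to \(1/e\) from above.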

I've done it the same way.

Martin Gütlbauer - 3 years, 6 months ago

My approach, as an engineer by trade, is somewhat heterodox.

You can consider the geometric mean as equivalent to the edge of a hypercube whose volume equals the product of the \(N\) values under consideration.

Taking logarithms on both sides of the geometric-mean formula gives \(\ln(G_m) = \left[\ln(x_1) + \ln(x_2) + \cdots + \ln(x_N)\right]/N\). With infinitely many equally probable values \(x_j\) between 0 and 1, the bracketed sum divided by \(N\) tends to the average of \(\ln x\) over \([0,1]\): the area under \(\ln x\) between 0 and 1, which is \(-1\), divided by the interval length, which is 1.

Therefore \(\ln(G_m) = -1\), so \(G_m = 1/e\).

The \(x\)'s are chosen independently from a uniform distribution.

\[\begin{aligned} G &= \sqrt[n]{x_1 x_2 \cdots x_n} = x_1^{1/n}\, x_2^{1/n} \cdots x_n^{1/n}\\ E[G] &= \left(E\!\left[x_1^{1/n}\right]\right)^n\\ E\!\left[x_1^{1/n}\right] &= \frac{1}{b-a}\int_a^b x_1^{1/n}\,dx_1 = \frac{1}{1-0}\int_0^1 x_1^{1/n}\,dx_1 = \frac{n}{n+1}\,x_1^{\frac{n+1}{n}}\bigg|_0^1 = \frac{n}{n+1}\\ E[G] &= \left(\frac{n}{n+1}\right)^n = \left(\frac{n+1}{n}\right)^{-n} = \left(\left(1+\frac{1}{n}\right)^n\right)^{-1}\\ \lim_{n\to\infty} E[G] &= \lim_{n\to\infty} \left(\left(1+\frac{1}{n}\right)^n\right)^{-1} = e^{-1} \end{aligned}\]

Anthony Forgette - 3 years, 6 months ago
Davide Lombardi
Nov 17, 2017

It is possible to write \(G_n\) as

\[G_n = e^{\frac{1}{n} \sum_{i=1}^n \log(a_i)} = e^{\mathrm{E}(\log(X_n))}\]

where \(X_n\) is the discrete uniform random variable on \(]0;1[\) given by the \(n\) samples, and \(\mathrm{E}\) denotes the mean (expected value) of the random variable \(X_n\).

To calculate \(\lim_{n \to +\infty} G_n\) it is sufficient to calculate the mean of the continuous random variable \(\log(X)\), because as \(n \to +\infty\) the discrete uniform random variable \(X_n\) tends to the continuous uniform random variable \(X\).

Let \(Y = \log(X)\). Then it is possible to calculate the probability density function (pdf) \(f_Y\) of \(Y\), knowing the pdf \(f_X\) of \(X\), with the change-of-variables formula for random variables:

\[f_Y(y) = f_X(x(y))\left|\frac{dx}{dy}\right|\]

The calculation gives

\[f_Y(y) = e^y \quad \text{with } y \in \,]-\infty;0[\]

So \(-Y\) is an exponentially distributed random variable with rate 1, and \(E[Y] = E[\log(X)] = -1\). Substituting, we obtain the desired result:

\[\lim_{n \to +\infty} G_n = \lim_{n \to +\infty} e^{\mathrm{E}(\log(X_n))} = e^{\mathrm{E}(\log(X))} = e^{\mathrm{E}(Y)} = e^{-1} \approx 0.367879\]
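The expectation \(E[Y] = \int_{-\infty}^{0} y\,e^y\,dy = -1\) can be checked with a simple quadrature sketch (my addition; the truncation point \(T\) and step count are arbitrary choices):

```python
import math

# Trapezoid-rule approximation of E[Y] = integral of y*e^y over (-T, 0),
# where Y has density e^y on (-infinity, 0). The tail below -T is
# negligible for large T since |y*e^y| decays exponentially.
def expected_y(T=50.0, steps=200_000):
    h = T / steps
    total = 0.0
    for i in range(steps + 1):
        y = -T + i * h
        w = 0.5 if i in (0, steps) else 1.0  # trapezoid endpoint weights
        total += w * y * math.exp(y)
    return total * h

print(expected_y())  # close to -1
```

The antiderivative \((y-1)e^y\) confirms the exact value \(-1\), which the quadrature matches to high precision.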

Mathematica: GeometricMean[Table[RandomReal[], 10^6]]

Χωρις Ονομας - 3 years, 6 months ago
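A Python analogue of the Mathematica one-liner above (my addition), computed in log space so the product of a million small factors doesn't underflow:

```python
import math
import random

random.seed(1)

# Geometric mean of 10^6 uniform(0,1) samples, the equivalent of
# GeometricMean[Table[RandomReal[], 10^6]], done via mean of logs.
# 1 - random.random() maps [0,1) to (0,1], avoiding log(0).
n = 10**6
gm = math.exp(sum(math.log(1.0 - random.random()) for _ in range(n)) / n)

print(gm)  # close to 1/e, approximately 0.3679
```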
