Toss the coin 2017 times

There is a biased coin whose probability of getting a head at the \(i^\text{th}\) flip in a game of 2017 flips is given by \(\sin^2\left(\frac{i\pi}{2017}\right)\).

If the variance of this probability distribution can be expressed as \(\frac{p}{q}\), where \(p\) and \(q\) are coprime positive integers, find \(p + q\).


Bonus: Generalize this.


The answer is 2025.


2 solutions

Kartik Sharma
Jan 7, 2017

Let's generalize it a little bit.

We consider a binomial-type experiment where the probability of success changes from trial to trial.

Let us denote the probability of success at the \(i\)th trial by \(p(i)\) and that of failure by \(q(i) = 1 - p(i)\).

So, what's the probability of exactly \(1\) success in a sequence of \(n\) trials?

\[ P(\text{1 success}) = p(1)q(2)q(3)\cdots q(n) + q(1)p(2)q(3)\cdots q(n) + \cdots + q(1)q(2)q(3)\cdots p(n) \]

Obviously, in layman's terms: whenever \(p(i)\) appears in a term, \(q(i)\) does not.

So, how can we build a model for this, given that we would have to find the probability of \(m\) successes?

We use generating functions.

\[ P(\text{1 success}) = [x^1]\,(p(1)x + q(1))(p(2)x + q(2))(p(3)x + q(3))\cdots(p(n)x + q(n)) \]

where \([x^k]f(x)\) means the coefficient of \(x^k\) in \(f(x)\).

What about \(2\) successes? We want two \(p(i)\)'s and the rest \(q\)'s, again such that whenever \(p(i)\) appears in a term, \(q(i)\) does not.

So, very clearly, it is the coefficient of \(x^2\) in our generating function.

Therefore, \(P(m \text{ successes}) = a_m\), where

\[ f(x) = (p(1)x + q(1))(p(2)x + q(2))(p(3)x + q(3))\cdots(p(n)x + q(n)) = a_0 + a_1 x + a_2 x^2 + \cdots + a_n x^n \]
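Not part of the original solution, but the coefficient construction above is easy to sanity-check numerically: multiplying the factors \((p(i)x + q(i))\) one at a time is just repeated convolution of coefficient arrays. A minimal Python sketch (the helper name `success_coeffs` is my own, hypothetical):

```python
import numpy as np

def success_coeffs(ps):
    """Coefficients a_0..a_n of f(x) = prod_i (q_i + p_i x), with q_i = 1 - p_i.
    By the generating-function argument, a_m = P(exactly m successes)."""
    coeffs = np.array([1.0])  # the constant polynomial 1
    for p in ps:
        # multiply by (q_i + p_i x); arrays hold ascending powers of x
        coeffs = np.convolve(coeffs, [1.0 - p, p])
    return coeffs

# Sanity check with 3 fair coins: P(m heads) = C(3, m) / 8
a = success_coeffs([0.5, 0.5, 0.5])
```

For fair coins this reproduces the ordinary binomial distribution \(\left(\tfrac18, \tfrac38, \tfrac38, \tfrac18\right)\), as expected.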

So that's our new binomial distribution, and from it we can get closed forms for questions like the probability of an odd number of successes, etc.

We now need to find the expected number of successes.

\[ E(X) = \frac{\sum_{m=0}^{n} m \, P(m \text{ successes})}{\sum_{m=0}^{n} P(m \text{ successes})} \]

\[ = \frac{a_0 \cdot 0 + a_1 \cdot 1 + a_2 \cdot 2 + \cdots + a_n \cdot n}{a_0 + a_1 + a_2 + \cdots + a_n} \]

\[ = \frac{f'(1)}{f(1)} \]

\[ = \left.\frac{d}{dx}\log(f(x))\right|_{x=1} \]

\[ = \sum_{i=1}^{n} \left.\frac{d}{dx}\log\left(p(i)x + q(i)\right)\right|_{x=1} = \sum_{i=1}^{n} \frac{p(i)}{p(i) + q(i)} \]

\[ \boxed{E(X) = \sum_{i=1}^{n} p(i)} \]
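As a quick numerical check of the identity \(E(X) = f'(1)/f(1) = \sum_i p(i)\) (my addition, not part of the solution), one can build \(f\) for arbitrary trial probabilities and compare:

```python
import numpy as np

rng = np.random.default_rng(1)
ps = rng.uniform(0, 1, 20)  # arbitrary per-trial success probabilities

# f(x) = prod (q_i + p_i x), coefficients stored in ascending powers of x
f = np.array([1.0])
for p in ps:
    f = np.convolve(f, [1.0 - p, p])

m = np.arange(len(f))
E = (m * f).sum() / f.sum()  # f'(1) / f(1), since f'(1) = sum m * a_m

assert np.isclose(E, ps.sum())  # matches the boxed formula E(X) = sum p(i)
```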

\[ \text{Variance} = \sigma^2 = \frac{a_0 \cdot 0^2 + a_1 \cdot 1^2 + a_2 \cdot 2^2 + \cdots + a_n \cdot n^2}{a_0 + a_1 + \cdots + a_n} - E^2 \]

\[ = \left.\frac{\left(x f'(x)\right)'}{f(x)}\right|_{x=1} - E^2 \]

\[ = \left.\frac{x f''(x)}{f(x)}\right|_{x=1} + \left.\frac{f'(x)}{f(x)}\right|_{x=1} - E^2 \]

It can be shown that \(f''(1) = (p(1) + p(2) + \cdots + p(n))^2 - p(1)^2 - p(2)^2 - \cdots - p(n)^2\).

Also, \(f(1) = 1\) and \(f'(1) = p(1) + p(2) + \cdots + p(n)\).

\[ = E^2 - \sum_{i=1}^{n} p(i)^2 + \sum_{i=1}^{n} p(i) - E^2 \]

\[ \boxed{\sigma^2 = \sum_{i=1}^{n} p(i) - \sum_{i=1}^{n} p(i)^2} \]
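The boxed variance formula can likewise be verified against the exact distribution of successes. This sketch (my addition, not from the solution) computes the variance directly from the generating-function coefficients and compares it with the closed form:

```python
import numpy as np

rng = np.random.default_rng(2)
ps = rng.uniform(0, 1, 30)  # arbitrary per-trial success probabilities

# distribution of the number of successes via the generating function
coeffs = np.array([1.0])
for p in ps:
    coeffs = np.convolve(coeffs, [1.0 - p, p])

m = np.arange(len(coeffs))
E = (m * coeffs).sum()
var = (m**2 * coeffs).sum() - E**2  # variance straight from the distribution

assert np.isclose(var, ps.sum() - (ps**2).sum())  # the boxed closed form
```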

Check that for the ordinary binomial distribution (constant \(p(i) = p\)) this reduces to the familiar \(np(1-p)\)!

For our case,

\[ \text{Variance} = \sigma^2 = \sum_{i=1}^{2017} \sin^2\left(\frac{i\pi}{2017}\right) - \sum_{i=1}^{2017} \sin^4\left(\frac{i\pi}{2017}\right) \]

\[ \sum_{i=1}^{n} \sin^2\left(\frac{i\pi}{n}\right) = \sum_{i=1}^{n} \frac{1}{2}\left(1 - \cos\left(\frac{2i\pi}{n}\right)\right) = \frac{n}{2} \]

\[ \sum_{i=1}^{n} \sin^4\left(\frac{i\pi}{n}\right) = \sum_{i=1}^{n} \frac{1}{4}\left(1 - \cos\left(\frac{2i\pi}{n}\right)\right)^2 \]

Expanding \(\left(1 - \cos\frac{2i\pi}{n}\right)^2 = 1 - 2\cos\frac{2i\pi}{n} + \cos^2\frac{2i\pi}{n}\), the middle cosine sum vanishes, and \(\cos^2 x = \frac{1}{2}(1 + \cos 2x)\) gives

\[ = \sum_{i=1}^{n} \frac{1}{4}\left(1 + \frac{1}{2}\left(1 + \cos\left(\frac{4i\pi}{n}\right)\right)\right) = \frac{n}{4} + \frac{n}{8} = \frac{3n}{8} \]

(the remaining cosine sum also vanishes for odd \(n\)).

Here I have repeatedly used \(\cos(2x) = 2\cos^2(x) - 1 = 1 - 2\sin^2(x)\) and \(\cos(\alpha) + \cos(\alpha + \beta) + \cdots + \cos(\alpha + (n-1)\beta) = \frac{\sin\left(\frac{n\beta}{2}\right)}{\sin\left(\frac{\beta}{2}\right)} \cos\left(\alpha + \frac{(n-1)\beta}{2}\right)\).

This yields our answer as \(\frac{n}{2} - \frac{3n}{8} = \frac{n}{8} = \frac{2017}{8}\), so \(p + q = 2017 + 8 = 2025\).
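The two trigonometric sums and the final answer can be confirmed numerically (a verification sketch of my own, not part of the original solution):

```python
import numpy as np

n = 2017
i = np.arange(1, n + 1)
s2 = np.sin(i * np.pi / n) ** 2  # the per-flip head probabilities

assert np.isclose(s2.sum(), n / 2)           # sum of sin^2 = n/2
assert np.isclose((s2**2).sum(), 3 * n / 8)  # sum of sin^4 = 3n/8

variance = s2.sum() - (s2**2).sum()
assert np.isclose(variance, n / 8)  # 2017/8, so p + q = 2017 + 8 = 2025
```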

Hi Kartik, I see that you have derived the equations for the expected value and the variance, but I think you are missing the evaluation of the sums of \(\sin^2 x\) and \(\sin^4 x\).

I think the evaluation is not immediate, and it would be great if you could elaborate on it in your solution. What do you think?

Pranshu Gaba - 4 years, 4 months ago


Yes, they are quite neat. I have added it. Can you check if it is what you wanted?

Kartik Sharma - 4 years, 4 months ago


Thanks, Kartik. This helps me understand how the sums of \(\sin^2 x\) and \(\sin^4 x\) were calculated.

Pranshu Gaba - 4 years, 4 months ago
Pi Han Goh
Jan 31, 2017



Essentially, we want to evaluate \(\text{Var}(X) = \sum_{j=1}^{2017} \text{Var}(X_j)\), where \(X_j\) is the indicator of a head on the \(j^\text{th}\) flip; the variances add because the flips are independent, and each term is \(p_j(1 - p_j)\) with \(p_j = \sin^2\frac{j\pi}{2017}\).

So this boils down to a trigonometric expression.

\[\begin{aligned}
\text{Var}(X) &= \sum_{j=1}^{2017} \sin^2\frac{j\pi}{2017} - \sum_{j=1}^{2017} \left(\sin^2\frac{j\pi}{2017}\right)^2 \\
&= \sum_{j=1}^{2017} \sin^2\frac{j\pi}{2017}\left(1 - \sin^2\frac{j\pi}{2017}\right) \\
&= \sum_{j=1}^{2017} \sin^2\frac{j\pi}{2017}\cos^2\frac{j\pi}{2017} \\
&= \frac{1}{4}\sum_{j=1}^{2017} 4\sin^2\frac{j\pi}{2017}\cos^2\frac{j\pi}{2017} \\
&= \frac{1}{4}\sum_{j=1}^{2017} \sin^2\frac{2j\pi}{2017} && \text{because } \sin(2A) = 2\sin A\cos A \\
&= \frac{1}{4}\sum_{j=1}^{2017} \frac{1 - \cos(4j\pi/2017)}{2} && \text{because } \sin^2(A) = \tfrac{1}{2}\left(1 - \cos(2A)\right) \\
&= \frac{1}{8}\sum_{j=1}^{2017} 1 - \frac{1}{8}\sum_{j=1}^{2017} \cos\left(\frac{4\pi j}{2017}\right)
\end{aligned}\]

What's left to do is to show that \(\sum_{j=1}^{2017} \cos\left(\frac{4\pi j}{2017}\right) = 0\), which can be done with Chebyshev polynomials of the first kind or with roots of unity. For the latter: since \(\gcd(2, 2017) = 1\), the numbers \(e^{4\pi i j/2017}\) for \(j = 1, \dots, 2017\) run over all \(2017^\text{th}\) roots of unity, which sum to \(0\); taking real parts gives the claim. Hence \(\text{Var}(X) = \frac{2017}{8}\) and \(p + q = 2025\).
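The roots-of-unity step can be confirmed numerically as well (my addition, a hedged sketch rather than part of the solution):

```python
import numpy as np

n = 2017
j = np.arange(1, n + 1)

# e^(4*pi*i*j/n) for j = 1..n runs over every n-th root of unity
# (since gcd(2, n) = 1), and the n-th roots of unity sum to zero.
roots_sum = np.exp(4 * np.pi * 1j * j / n).sum()
cos_sum = np.cos(4 * np.pi * j / n).sum()  # its real part

assert abs(roots_sum) < 1e-9
assert abs(cos_sum) < 1e-9
```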

I never knew Variance had such a formula and was so happy that I derived something new.

Kartik Sharma - 4 years, 4 months ago


The formula for variance is just \(\text{Var}(X) = E\left((X - \text{mean})^2\right)\).

Pi Han Goh - 4 years, 4 months ago
