Rubber Physics (Part 4)

Imagine a chain of $N$ chain links aligned parallel to the $x$-axis. Each individual chain link has length $l$ and can be aligned either to the right in the $+x$ direction (blue links) or to the left in the $-x$ direction (red links). In each case, we associate with the $i^\text{th}$ chain link a direction vector $\vec l_i = \pm l \, \vec e_x$, whose sign identifies the orientation of the link. The distance vector from start to end of the chain is
\[
\vec r = \vec l_1 + \vec l_2 + \dots + \vec l_N = \sum_{i=1}^N \vec l_i.
\]
Let us assume that the alignment of the chain links is purely random and uncorrelated. The probabilities for the alignment of a chain link to the left and right are the same.
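As a concrete illustration of the model (our addition, not part of the original problem), a single random chain configuration and its end-to-end vector can be generated in a few lines of Python; all names below are ours:

```python
import random

# One random configuration of the chain: each link points in +x or -x
# with equal probability, i.e. s_i = +1 or s_i = -1.
N = 10    # number of chain links (small, for illustration)
l = 1.0   # link length

signs = [random.choice([+1, -1]) for _ in range(N)]  # s_1, ..., s_N
r_x = l * sum(signs)  # x-component of r = sum_i l_i

print("link orientations:", signs)
print("end-to-end distance r_x =", r_x, "(in units of l:", r_x / l, ")")
```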

What is the mean value $r_\text{m} = \sqrt{\big\langle \vec r^{\,2} \big\rangle}$ of the distance between the start and end of the chain? Evaluate $r_\text{m}$ for the case $N = 10^4$ and give the result in units of $l$.


The answer is 100.


1 solution

Markus Michelmann
Jan 13, 2018

Instead of the distance vector $\vec r$, consider the number
\[
n = \frac{\vec r \cdot \vec e_x}{l} = \frac{1}{l} \sum_{i=1}^N \vec l_i \cdot \vec e_x = \sum_{i=1}^N s_i \in \mathbb{Z}
\]
with the sign $s_i \in \{+1, -1\}$ of the orientation vector $\vec l_i = s_i l \vec e_x$. This number can be expressed as
\[
n = N_+ - N_- = 2 N_+ - N
\]
with the number of positive vectors, $N_+$, and the number of negative vectors, $N_- = N - N_+$. The probability that there are exactly $N_+$ positively oriented chain links corresponds to a binomial distribution:
\[
P_N(N_+) = \frac{N!}{N_+! (N - N_+)!} p_+^{N_+} p_-^{N - N_+} = \binom{N}{N_+} p_+^{N_+} p_-^{N - N_+}
\]
with probabilities $p_+$ and $p_- = 1 - p_+$ for the cases $s_i = +1$ and $s_i = -1$, respectively. Here, both probabilities are the same, so that $p_+ = p_- = \frac{1}{2}$. Since $n = 2N_+ - N$, the mean value of the squared distance is
\[
\langle \vec r^{\,2} \rangle = \langle n^2 \rangle \, l^2 = \left(4 \langle N_+^2 \rangle - 4 \langle N_+ \rangle N + N^2\right) l^2.
\]
The mean values $\langle N_+ \rangle$ and $\langle N_+^2 \rangle$ can be evaluated as follows:
\[
\begin{aligned}
\langle N_+ \rangle &= \sum_{N_+ = 0}^N N_+ \, P_N(N_+) \\
&= \sum_{N_+ = 0}^N N_+ \frac{N!}{N_+! (N - N_+)!} p_+^{N_+} p_-^{N - N_+} \\
&= \sum_{N_+ = 1}^N \frac{N!}{(N_+ - 1)! (N - N_+)!} p_+^{N_+} p_-^{N - N_+} \\
&= \sum_{N_+ = 1}^N N p_+ \cdot \frac{(N-1)!}{(N_+ - 1)! \left((N-1) - (N_+ - 1)\right)!} p_+^{N_+ - 1} p_-^{(N-1) - (N_+ - 1)} \\
&= N p_+ \sum_{N_+' = 0}^{N-1} \frac{(N-1)!}{N_+'! \left((N-1) - N_+'\right)!} p_+^{N_+'} p_-^{(N-1) - N_+'} \\
&= N p_+ \sum_{N_+' = 0}^{N-1} P_{N-1}(N_+') \\
&= N p_+
\end{aligned}
\]
\[
\begin{aligned}
\langle N_+^2 \rangle &= \sum_{N_+ = 0}^N N_+^2 \, P_N(N_+) \\
&= \sum_{N_+ = 0}^N N_+ (N_+ - 1) P_N(N_+) + \sum_{N_+ = 0}^N N_+ \, P_N(N_+) \\
&= \sum_{N_+ = 0}^N N_+ (N_+ - 1) \frac{N!}{N_+! (N - N_+)!} p_+^{N_+} p_-^{N - N_+} + \langle N_+ \rangle \\
&= \sum_{N_+ = 2}^N \frac{N!}{(N_+ - 2)! (N - N_+)!} p_+^{N_+} p_-^{N - N_+} + N p_+ \\
&= \sum_{N_+ = 2}^N N (N-1) p_+^2 \, \frac{(N-2)!}{(N_+ - 2)! \left((N-2) - (N_+ - 2)\right)!} p_+^{N_+ - 2} p_-^{(N-2) - (N_+ - 2)} + N p_+ \\
&= N (N-1) p_+^2 \sum_{N_+' = 0}^{N-2} \frac{(N-2)!}{N_+'! \left((N-2) - N_+'\right)!} p_+^{N_+'} p_-^{(N-2) - N_+'} + N p_+ \\
&= N (N-1) p_+^2 \sum_{N_+' = 0}^{N-2} P_{N-2}(N_+') + N p_+ \\
&= N (N-1) p_+^2 + N p_+
\end{aligned}
\]
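As a sanity check (our addition, not part of the original solution), these two closed forms can be verified by direct summation over the binomial distribution; the helper name `binomial_moments` below is ours:

```python
from math import comb

def binomial_moments(N, p_plus):
    """<N_+> and <N_+^2> by direct summation over P_N(N_+)."""
    p_minus = 1.0 - p_plus
    pmf = [comb(N, k) * p_plus**k * p_minus**(N - k) for k in range(N + 1)]
    m1 = sum(k * pk for k, pk in enumerate(pmf))
    m2 = sum(k**2 * pk for k, pk in enumerate(pmf))
    return m1, m2

N, p = 100, 0.5
m1, m2 = binomial_moments(N, p)
print(m1, "vs", N * p)                       # <N_+>   = N p_+
print(m2, "vs", N * (N - 1) * p**2 + N * p)  # <N_+^2> = N(N-1) p_+^2 + N p_+
```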
Therefore, we get the final result
\[
\begin{aligned}
\langle \vec r^{\,2} \rangle &= \left(4 \left(N(N-1) p_+^2 + N p_+\right) - 4 N^2 p_+ + N^2\right) l^2 \\
&= \left(4 N(N-1) \left(p_+^2 - p_+\right) + N^2\right) l^2 \\
&= \left(4 N(N-1) \left(\frac{1}{4} - \frac{1}{2}\right) + N^2\right) l^2 \\
&= \left(-N(N-1) + N^2\right) l^2 = N l^2
\end{aligned}
\]
\[
\Rightarrow \quad \boxed{r_\text{m} = \sqrt{\langle \vec r^{\,2} \rangle} = \sqrt{N} \cdot l = 100 \, l}
\]
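The boxed result can also be checked by simulation: averaging $r^2$ over many randomly generated chains with $N = 10^4$ links should reproduce $r_\text{m} \approx 100\,l$. A minimal Monte Carlo sketch (our own check, assuming NumPy is available):

```python
import numpy as np

rng = np.random.default_rng(seed=0)
N, trials = 10_000, 2_000  # N = 10^4 links, averaged over 2000 chains

# Each row is one chain: N random signs s_i = +/-1; r/l is their sum.
signs = rng.choice([-1, 1], size=(trials, N))
r_over_l = signs.sum(axis=1).astype(float)

r_m = np.sqrt(np.mean(r_over_l**2))
print(f"Monte Carlo: r_m ≈ {r_m:.1f} l  (theory: sqrt(N) l = {np.sqrt(N):.0f} l)")
```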

I have approached this as a "random walk" problem in one dimension, and just looked up the formula to calculate the expected distance traveled. It is interesting to learn from your solution how the formula is derived.

Gediminas Sadzius - 1 year, 5 months ago
