ass1sol - STAT455/855 Fall 2009 Applied Stochastic...

STAT455/855 Fall 2009 Applied Stochastic Processes
Assignment #1, Solutions
Total Marks: 35 for 455 and 40 for 855.

1. Ross, Chapter 3 #28. (12 marks)

Let us define
\[
Y_i = \begin{cases} 1 & \text{if the $i$th draw is red} \\ 0 & \text{if the $i$th draw is blue.} \end{cases}
\]
Then $X_k = Y_1 + \cdots + Y_k$.

(a) (1 mark) The expected value of $X_1$ is the expected value of $Y_1$, which is just the probability that the first draw is a red ball; that is, $E[X_1] = E[Y_1] = r/(r+b)$.

(b) (2 marks) To get $E[X_2]$ we first compute $E[Y_2]$. Conditioning on $X_1$, we have
\begin{align*}
E[Y_2] &= E[Y_2 \mid X_1 = 0]\,P(X_1 = 0) + E[Y_2 \mid X_1 = 1]\,P(X_1 = 1) \\
&= P(\text{2nd draw is red} \mid \text{1st draw is blue})\,\frac{b}{r+b} + P(\text{2nd draw is red} \mid \text{1st draw is red})\,\frac{r}{r+b} \\
&= \left(\frac{r}{r+b+m}\right)\left(\frac{b}{r+b}\right) + \left(\frac{r+m}{r+b+m}\right)\left(\frac{r}{r+b}\right) \\
&= \frac{r(r+b+m)}{(r+b)(r+b+m)} = \frac{r}{r+b}.
\end{align*}
Therefore,
\[
E[X_2] = E[Y_1] + E[Y_2] = \frac{r}{r+b} + \frac{r}{r+b} = \frac{2r}{r+b}.
\]

(c) (3 marks) Similarly, to get $E[X_3]$ we first compute $E[Y_3]$ by conditioning on $X_2$. The conditional expectation of $Y_3$ given that $X_2 = j$, for $j = 0, 1, 2$, is
\[
E[Y_3 \mid X_2 = j] = P(\text{3rd draw is red} \mid j \text{ red in first 2 draws}) = \frac{r+jm}{r+b+2m}.
\]
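As an aside (not part of the original solutions), the small cases in parts (a)–(c) can be checked exactly by recursing over the outcome tree of the urn scheme: a drawn ball is returned together with $m$ more balls of its colour. A minimal Python sketch; the function name `expected_reds` is illustrative:

```python
from functools import lru_cache

def expected_reds(r, b, m, k):
    """Exact E[X_k] for the urn scheme: draw a ball, return it plus
    m more of the same colour; X_k = number of reds drawn in k draws."""
    @lru_cache(maxsize=None)
    def go(reds, blues, draws_left):
        if draws_left == 0:
            return 0.0
        p_red = reds / (reds + blues)
        # red branch adds 1 to the count and m red balls to the urn;
        # blue branch adds m blue balls
        return (p_red * (1 + go(reds + m, blues, draws_left - 1))
                + (1 - p_red) * go(reds, blues + m, draws_left - 1))
    return go(r, b, k)

# Matches kr/(r+b): e.g. with r=3, b=5, m=2,
# expected_reds(3, 5, 2, 2) is 2*3/8 = 0.75 (up to rounding).
```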
Therefore, conditioning on $X_2$ we can compute $E[Y_3]$ as
\begin{align*}
E[Y_3] &= \sum_{j=0}^{2} E[Y_3 \mid X_2 = j]\,P(X_2 = j) = \sum_{j=0}^{2} \left(\frac{r+jm}{r+b+2m}\right) P(X_2 = j) \\
&= \frac{r + mE[X_2]}{r+b+2m} = \frac{r + 2mr/(r+b)}{r+b+2m} = \frac{r(r+b+2m)}{(r+b)(r+b+2m)} = \frac{r}{r+b}.
\end{align*}
Then we have
\[
E[X_3] = E[Y_1] + E[Y_2] + E[Y_3] = \frac{r}{r+b} + \frac{r}{r+b} + \frac{r}{r+b} = \frac{3r}{r+b}.
\]

(d) (3 marks) From parts (a), (b), and (c), the pattern appears to be $E[X_k] = kr/(r+b)$. We can verify this conjecture by showing that $E[Y_i] = r/(r+b)$ for all $i \ge 1$. We have already shown that this holds for $i = 1, 2, 3$. We use an induction argument, so assume the pattern holds for $i = 1, \ldots, k-1$. After $k-1$ draws, given $X_{k-1} = j$ the urn contains $r + mj$ red balls out of $r + b + (k-1)m$ balls in total. Therefore
\[
E[Y_k \mid X_{k-1} = j] = \frac{r+mj}{r+b+(k-1)m},
\]
so conditioning on $X_{k-1}$ we have
\begin{align*}
E[Y_k] &= \sum_{j=0}^{k-1} E[Y_k \mid X_{k-1} = j]\,P(X_{k-1} = j) = \sum_{j=0}^{k-1} \frac{r+mj}{r+b+(k-1)m}\,P(X_{k-1} = j) \\
&= \frac{r + mE[X_{k-1}]}{r+b+(k-1)m} = \frac{r + m(E[Y_1] + \cdots + E[Y_{k-1}])}{r+b+(k-1)m} \\
&= \frac{r + m(k-1)r/(r+b)}{r+b+(k-1)m} = \frac{r}{r+b},
\end{align*}
where the fifth equality follows from the induction hypothesis.

(e) (3 marks) We define the types as suggested in the hint. The symmetry argument is that each type should evolve statistically identically, so that the expected number of type $i$ balls in the urn after $k$ draws is the same as the expected number of
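The general pattern $E[X_k] = kr/(r+b)$ established in part (d) can also be spot-checked by Monte Carlo simulation of the urn. This is not part of the original solution; `simulate_reds` is a hypothetical helper, and the comparison must use a loose tolerance because the estimate is random:

```python
import random

def simulate_reds(r, b, m, k, trials=200_000, seed=1):
    """Monte Carlo estimate of E[X_k] for the urn scheme."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        reds, blues, drawn_red = r, b, 0
        for _ in range(k):
            if rng.random() < reds / (reds + blues):
                drawn_red += 1
                reds += m      # return the red ball plus m more reds
            else:
                blues += m     # return the blue ball plus m more blues
        total += drawn_red
    return total / trials

# With r=2, b=3, m=1, k=5 the estimate should be close to 5*2/5 = 2.0.
```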
type $j$ balls in the urn after $k$ draws, for any $i \neq j$ and for all $k \ge 1$. Since after $k$ draws there are exactly $r + b + km$ balls in the urn and there are $r+b$ types, the expected number of type $i$ balls should be
\[
\frac{r+b+km}{r+b} = 1 + \frac{km}{r+b}.
\]
Since
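The symmetry claim in part (e) can likewise be checked numerically (again, not part of the original solutions): label each of the $r+b$ initial balls with its own type and, after each draw, add $m$ copies of the drawn type; every type's expected count after $k$ draws should then be $1 + km/(r+b)$. A sketch with an illustrative helper name:

```python
import random

def average_type_counts(r, b, m, k, trials=50_000, seed=2):
    """Estimate the expected number of balls of each initial type
    after k draws, giving every starting ball a distinct type."""
    n_types = r + b
    counts = [0] * n_types
    rng = random.Random(seed)
    for _ in range(trials):
        urn = list(range(n_types))    # one ball of each type
        for _ in range(k):
            t = rng.choice(urn)       # draw uniformly from the urn
            urn.extend([t] * m)       # return it with m copies of its type
        for t in urn:
            counts[t] += 1
    return [c / trials for c in counts]

# With r=2, b=2, m=1, k=4 every entry should be near 1 + 4/4 = 2.0,
# and the entries always sum to the final urn size r + b + km = 8.
```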

This note was uploaded on 09/15/2010 for the course STAT 455/855, taught by Professor Glen Takahara during the Fall 2009 term at Queen's University.
