MIT6_041F10_assn09_sol

Massachusetts Institute of Technology
Department of Electrical Engineering & Computer Science
6.041/6.431: Probabilistic Systems Analysis
(Fall 2010)

Problem Set 9 Solutions

1. (a) Yes, to 0. Applying the weak law of large numbers, we have

        P(|U_i − μ| > ε) → 0 as i → ∞, for all ε > 0.

   Here μ = 0 since X_i ~ U(−1.0, 1.0).

   (b) Yes, to 1. Since W_i ≤ 1, we have for ε > 0,

        lim_{i→∞} P(|W_i − 1| > ε) = lim_{i→∞} P(max{X_1, ..., X_i} < 1 − ε)
                                   = lim_{i→∞} P(X_1 < 1 − ε) ··· P(X_i < 1 − ε)
                                   = lim_{i→∞} (1 − ε/2)^i
                                   = 0.

   (c) Yes, to 0. |V_n| ≤ min{|X_1|, |X_2|, ..., |X_n|}, but min{|X_1|, |X_2|, ..., |X_n|} converges to 0 in probability. So, since |V_n| ≥ 0, |V_n| converges to 0 in probability. To see why min{|X_1|, |X_2|, ..., |X_n|} converges to 0 in probability, note that:

        lim_{i→∞} P(|min{|X_1|, ..., |X_i|} − 0| > ε) = lim_{i→∞} P(min{|X_1|, ..., |X_i|} > ε)
                                                      = lim_{i→∞} P(|X_1| > ε) P(|X_2| > ε) ··· P(|X_i| > ε)
                                                      = lim_{i→∞} (1 − ε)^i, since |X_i| is uniform between 0 and 1,
                                                      = 0.

2. Consider a random variable X with PMF

        p_X(x) = { p,       if x = μ − c;
                   p,       if x = μ + c;
                   1 − 2p,  if x = μ.

   The mean of X is μ, and the variance of X is 2pc². To make the variance equal σ², set

        p = σ² / (2c²).

   For this random variable, we have

        P(|X − μ| ≥ c) = 2p = σ²/c²,

   and therefore the Chebyshev inequality is tight.

3. (a) Let t_i be the expected time until the state HT is reached, starting in state i, i.e., the mean first passage time to reach state HT starting in state i. Note that t_S is the expected number of tosses until first observing heads directly followed by tails. We have

        t_S = 1 + (1/2) t_H + (1/2) t_T
        t_T = 1 + (1/2) t_H + (1/2) t_T
        t_H = 1 + (1/2) t_H

   and by solving these equations, we find that the expected number of tosses until first observing heads directly followed by tails is t_S = 4.

   (b) To find the expected number of additional tosses necessary to again observe heads followed by tails, we recognize that this is the mean recurrence time t*_HT of state HT. This can be determined as

        t*_HT = 1 + p_{HT,H} t_H + p_{HT,T} t_T
              = 1 + (1/2)(2) + (1/2)(4)
              = 4.

   (c) Let us consider a Markov chain with states S, H, T, TT, where S is a starting state, H indicates heads on the current toss, T indicates tails on the current toss (without tails on the previous toss), and TT indicates tails over the last two tosses. The transition probabilities for this Markov chain are illustrated below in the state transition diagram:

        [State transition diagram: states S, H, T, and TT, with each coin-toss transition having probability 1/2.]

   Let t_i be the expected time until the state TT is reached, starting in state i, i.e., the mean first passage time to reach state...
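As a quick numerical check of Problem 1 (a sketch, not part of the original solution), the Python snippet below simulates the three sequences for i.i.d. X_i uniform on (−1, 1). The sample size n, the number of trials, and the tolerance ε = 0.05 are illustrative choices.

import numpy as np

# Simulation check for Problem 1: X_1, ..., X_n i.i.d. uniform on (-1, 1).
# U_n is the sample mean, W_n the maximum, and V_n the product of X_1, ..., X_n.
rng = np.random.default_rng(0)
n, trials, eps = 1000, 5000, 0.05

X = rng.uniform(-1.0, 1.0, size=(trials, n))
U = X.mean(axis=1)    # sample mean in each trial
W = X.max(axis=1)     # maximum in each trial
V = X.prod(axis=1)    # product in each trial

print("P(|U_n - 0| > eps) ~", np.mean(np.abs(U) > eps))      # close to 0
print("P(|W_n - 1| > eps) ~", np.mean(np.abs(W - 1) > eps))  # close to 0
print("P(|V_n - 0| > eps) ~", np.mean(np.abs(V) > eps))      # close to 0

All three empirical probabilities should be near 0 for large n, consistent with convergence in probability to 0, 1, and 0 respectively.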
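For Problem 2, a minimal sketch that evaluates the three-point PMF directly; the particular values μ = 0, c = 2, σ = 1 are assumptions chosen only for illustration.

import numpy as np

# Check that the PMF in Problem 2 makes the Chebyshev inequality tight.
mu, c, sigma = 0.0, 2.0, 1.0        # illustrative values (assumed)
p = sigma**2 / (2 * c**2)           # p = sigma^2 / (2 c^2)

values = np.array([mu - c, mu, mu + c])
probs  = np.array([p, 1 - 2 * p, p])

mean = np.dot(probs, values)
var  = np.dot(probs, (values - mean)**2)
tail = probs[np.abs(values - mean) >= c].sum()

print("mean =", mean)                               # equals mu
print("variance =", var)                            # equals sigma^2
print("P(|X - mu| >= c) =", tail)                   # equals 2p
print("Chebyshev bound sigma^2/c^2 =", sigma**2 / c**2)  # equal to the tail probability

The printed tail probability and the Chebyshev bound coincide, which is exactly the tightness claim.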
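For parts (a) and (b) of Problem 3, the sketch below solves the same first-passage equations numerically; writing them in matrix form over the transient states ordered [S, H, T] (with HT treated as absorbing) is my own arrangement, not taken from the original solution.

import numpy as np

# Problem 3(a)-(b): mean first passage times to HT and mean recurrence time of HT.
# Q holds transition probabilities among the transient states [S, H, T];
# transitions into the target state HT are dropped.
Q = np.array([[0.0, 0.5, 0.5],   # from S: to H, to T
              [0.0, 0.5, 0.0],   # from H: to H (to HT with prob 1/2, dropped)
              [0.0, 0.5, 0.5]])  # from T: to H, to T

# Mean first passage times satisfy t = 1 + Q t, i.e. (I - Q) t = 1.
t = np.linalg.solve(np.eye(3) - Q, np.ones(3))
t_S, t_H, t_T = t
print("t_S, t_H, t_T =", t_S, t_H, t_T)           # 4, 2, 4

# Mean recurrence time of HT: one more toss, then first passage from H or T.
t_star_HT = 1 + 0.5 * t_H + 0.5 * t_T
print("mean recurrence time of HT =", t_star_HT)  # 4

The solver returns t_S = 4 and t*_HT = 4, matching the values derived above.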