

EE 351K Probability and Random Processes, Fall 2010
Instructor: Haris Vikalo ([email protected])
Homework 10 solutions

Problem 1

Let the random variable X_i take value 1 if it rained on the i-th day in the past and 0 otherwise (so X_1 = 1 if it rained yesterday). Let Y = (X_1, X_2, X_3), the vector random variable carrying the rain information for the past three days. Since the probability that it rains today is completely determined by the past three days, Y is a Markov chain. Define the states of Y as follows: 000 is state 0, 001 is state 1, {010, 011} is state 2, {100, 101} is state 3, 110 is state 4, and 111 is state 5. (The two configurations grouped into state 2, and likewise into state 3, can share a state because they have identical transition probabilities.) The transition probability matrix is

        0.8   0     0     0.2   0     0
        0.6   0     0     0.4   0     0
        0     0.6   0     0.4   0     0
        0     0     0.4   0     0.6   0
        0     0     0.4   0     0     0.6
        0     0     0.2   0     0     0.8

Problem 2

X_n is not a Markov chain. We prove this by showing that

    p(X_{n+1} = 2 | X_n = 1)  ≠  p(X_{n+1} = 2 | X_n = 1, X_{n-1} = 0).

Indeed,

    p(X_{n+1} = 2 | X_n = 1) = p(X_{n+1} = 2, X_n = 1) / p(X_n = 1) = (1/8) / (1/2) = 1/4,

while

    p(X_{n+1} = 2 | X_n = 1, X_{n-1} = 0) = p(Y_{n+1} = 1) = 1/2.

Problem 3

[State-transition diagram: states E (Easy), M (Medium), H (Hard), with the edge probabilities given by the matrix P below.]

Let the states of the Markov chain be 0 = Easy, 1 = Medium, 2 = Hard. The transition probability matrix P is

        0.5   0.25  0.25
        0.25  0.5   0.25
        0.5   0.5   0

The steady-state probabilities, calculated from the equation π = πP together with the normalization π_0 + π_1 + π_2 = 1, are

    π_0 = 2/5,  π_1 = 2/5,  π_2 = 1/5.
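The Problem 1 transition matrix can be sanity-checked the same way: verify that it is row-stochastic and compute its stationary distribution by iterating the matrix. This is a check on the solution above, not part of the original assignment.

```python
import numpy as np

# Transition matrix from Problem 1 (states 0..5 encode the rain history
# as described above: 0=000, 1=001, 2={010,011}, 3={100,101}, 4=110, 5=111).
P = np.array([
    [0.8, 0.0, 0.0, 0.2, 0.0, 0.0],
    [0.6, 0.0, 0.0, 0.4, 0.0, 0.0],
    [0.0, 0.6, 0.0, 0.4, 0.0, 0.0],
    [0.0, 0.0, 0.4, 0.0, 0.6, 0.0],
    [0.0, 0.0, 0.4, 0.0, 0.0, 0.6],
    [0.0, 0.0, 0.2, 0.0, 0.0, 0.8],
])

# A valid transition matrix must have every row summing to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# For a large n, every row of P^n converges to the stationary distribution.
pi = np.linalg.matrix_power(P, 200)[0]
print(np.round(pi, 4))

# Stationarity check: pi P = pi.
assert np.allclose(pi @ P, pi)
```

States 3, 4, and 5 are exactly those whose most recent bit is 1, so pi[3] + pi[4] + pi[5] gives the long-run probability that it rains on a given day.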

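The Problem 2 inequality can also be verified by exact enumeration. The problem statement is not included in this excerpt, but the numbers above are consistent with the assumption that the Y_i are i.i.d. fair Bernoulli variables and X_n = Y_n + Y_{n-1}; that assumption is what the sketch below uses.

```python
from itertools import product
from fractions import Fraction

# Assumption (the setup is not in this excerpt): Y_i i.i.d. Bernoulli(1/2),
# X_n = Y_n + Y_{n-1}.  Enumerate the 16 equally likely values of
# (Y_{n-2}, Y_{n-1}, Y_n, Y_{n+1}) and count outcomes.
def cond_prob(event, given):
    outcomes = list(product([0, 1], repeat=4))
    num = sum(1 for y in outcomes if event(y) and given(y))
    den = sum(1 for y in outcomes if given(y))
    return Fraction(num, den)

# X at tuple index k corresponds to X_{n-2+k}; X(y, k) = Y_k + Y_{k-1}.
X = lambda y, k: y[k] + y[k - 1]

# p(X_{n+1} = 2 | X_n = 1)
p1 = cond_prob(lambda y: X(y, 3) == 2, lambda y: X(y, 2) == 1)
# p(X_{n+1} = 2 | X_n = 1, X_{n-1} = 0)
p2 = cond_prob(lambda y: X(y, 3) == 2,
               lambda y: X(y, 2) == 1 and X(y, 1) == 0)

print(p1, p2)  # 1/4 1/2: the extra conditioning changes the probability,
               # so X_n is not a Markov chain
```

The enumeration reproduces both values from the solution, 1/4 and 1/2, confirming that conditioning on X_{n-1} changes the transition probability.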
