EE 351K Probability and Random Processes, Fall 2010
Instructor: Haris Vikalo (hvikalo@ece.utexas.edu)
Homework 10 solutions

Problem 1

Let the random variable X_i take the value 1 if it rained on the i-th day in the past and 0 otherwise (so X_1 = 1 if it rained yesterday). Let Y = (X_1, X_2, X_3); Y is the vector random variable carrying the rain information for the past three days. Since the probability that it rains today is completely determined by the past three days, Y is a Markov chain. Its states are labeled as follows: 000 is state 0, 001 is state 1, {010, 011} is state 2, {100, 101} is state 3, 110 is state 4, and 111 is state 5. The transition probability matrix is

P = \begin{pmatrix}
0.8 & 0   & 0   & 0.2 & 0   & 0   \\
0.6 & 0   & 0   & 0.4 & 0   & 0   \\
0   & 0.6 & 0   & 0.4 & 0   & 0   \\
0   & 0   & 0.4 & 0   & 0.6 & 0   \\
0   & 0   & 0.4 & 0   & 0   & 0.6 \\
0   & 0   & 0.2 & 0   & 0   & 0.8
\end{pmatrix}
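As a quick numerical sanity check (not part of the original solution), the Python sketch below builds this transition matrix with numpy, verifies that each row sums to 1, and computes an example three-step distribution; the use of numpy and the choice of starting state are assumptions made purely for illustration.

    import numpy as np

    # Transition matrix of Y on states 0..5
    # (state order: 000, 001, {010,011}, {100,101}, 110, 111).
    P = np.array([
        [0.8, 0.0, 0.0, 0.2, 0.0, 0.0],
        [0.6, 0.0, 0.0, 0.4, 0.0, 0.0],
        [0.0, 0.6, 0.0, 0.4, 0.0, 0.0],
        [0.0, 0.0, 0.4, 0.0, 0.6, 0.0],
        [0.0, 0.0, 0.4, 0.0, 0.0, 0.6],
        [0.0, 0.0, 0.2, 0.0, 0.0, 0.8],
    ])

    # Every row of a transition probability matrix must sum to 1.
    assert np.allclose(P.sum(axis=1), 1.0)

    # Illustrative use: the distribution three days ahead, starting from
    # state 5 (it rained on each of the past three days).
    mu0 = np.zeros(6)
    mu0[5] = 1.0
    print(mu0 @ np.linalg.matrix_power(P, 3))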

Problem 2

X_n is not a Markov chain. We prove this by showing that

p(X_{n+1} = 2 \mid X_n = 1) \neq p(X_{n+1} = 2 \mid X_n = 1, X_{n-1} = 0).

Indeed,

p(X_{n+1} = 2 \mid X_n = 1) = \frac{p(X_{n+1} = 2, X_n = 1)}{p(X_n = 1)} = \frac{1/8}{1/2} = \frac{1}{4},

whereas

p(X_{n+1} = 2 \mid X_n = 1, X_{n-1} = 0) = p(Y_{n+1} = 1) = \frac{1}{2}.

Problem 3

[Figure: state transition diagram over the states E, M, and H; the edge labels are the transition probabilities given in the matrix below.]

Let the states of the Markov chain be 0-Easy, 1-Medium, 2-Hard. The transition probability matrix is

P = \begin{pmatrix}
0.5  & 0.25 & 0.25 \\
0.25 & 0.5  & 0.25 \\
0.5  & 0.5  & 0
\end{pmatrix}

The steady-state probabilities, calculated from the balance equations \pi = \pi P together with the normalization \sum_i \pi_i = 1, are

\pi_0 = \frac{2}{5}, \quad \pi_1 = \frac{2}{5}, \quad \pi_2 = \frac{1}{5}.
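As a numerical cross-check on Problem 3 (again not part of the original solution), the sketch below rewrites \pi = \pi P as (P^T - I)\pi = 0, appends the normalization row, and solves the resulting system by least squares; numpy is assumed to be available.

    import numpy as np

    # Transition matrix from Problem 3 (states 0-Easy, 1-Medium, 2-Hard).
    P = np.array([
        [0.50, 0.25, 0.25],
        [0.25, 0.50, 0.25],
        [0.50, 0.50, 0.00],
    ])

    # Solve pi = pi P together with sum(pi) = 1: stack the normalization
    # row onto (P^T - I) and solve the overdetermined system.
    A = np.vstack([P.T - np.eye(3), np.ones((1, 3))])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(pi)  # approximately [0.4, 0.4, 0.2], i.e. (2/5, 2/5, 1/5)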