P(X_{n+1} = 2 | X_n = 1) ≠ P(X_{n+1} = 2 | X_n = 1, X_{n-1} = 0).

P(X_{n+1} = 2 | X_n = 1) = P(X_{n+1} = 2, X_n = 1) / P(X_n = 1) = (1/8) / (1/2) = 1/4

P(X_{n+1} = 2 | X_n = 1, X_{n-1} = 0) = P(Y_{n+1} = 1) = 1/2

Problem 3

[State transition diagram over states E (Easy), M (Medium), H (Hard), with the transition probabilities given in the matrix below.]

Let the states of the Markov chain be 0 = Easy, 1 = Medium, 2 = Hard. The transition probability matrix P is

        | 0.50  0.25  0.25 |
    P = | 0.25  0.50  0.25 |
        | 0.50  0.50  0.00 |

The steady-state probabilities, calculated from the equation π = πP (together with the normalization π_0 + π_1 + π_2 = 1), are

    π_0 = 2/5,  π_1 = 2/5,  π_2 = 1/5.
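The steady-state result for Problem 3 can be checked numerically. The sketch below (a power-iteration check, not part of the original solution) repeatedly applies the transition matrix to an arbitrary starting distribution; since the chain is irreducible and aperiodic, the iterates converge to the unique stationary distribution.

```python
# Numerical check of the Problem 3 steady-state probabilities.
# Rows/columns are ordered 0 = Easy, 1 = Medium, 2 = Hard.
P = [
    [0.50, 0.25, 0.25],  # from Easy
    [0.25, 0.50, 0.25],  # from Medium
    [0.50, 0.50, 0.00],  # from Hard
]

# Power iteration: pi <- pi P until convergence.
pi = [1 / 3, 1 / 3, 1 / 3]  # any initial distribution works here
for _ in range(1000):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

print([round(p, 4) for p in pi])  # -> [0.4, 0.4, 0.2], i.e. (2/5, 2/5, 1/5)
```

This agrees with the hand calculation from π = πP above.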
This note was uploaded on 04/10/2011 for the course EE 351k taught by Professor Bard during the Spring '07 term at University of Texas at Austin.