p(X_{n+1} = 2 | X_n = 1) ≠ p(X_{n+1} = 2 | X_n = 1, X_{n-1} = 0).

p(X_{n+1} = 2 | X_n = 1) = p(X_{n+1} = 2, X_n = 1) / p(X_n = 1) = (1/8) / (1/2) = 1/4

p(X_{n+1} = 2 | X_n = 1, X_{n-1} = 0) = p(Y_{n+1} = 1) = 1/2

Problem 3

[Figure: transition diagram over the states E, M, H with edge probabilities 0.5, 0.25, 0.25, 0.25, 0.5, 0.25, 0.5, 0.5]

Let the states of the Markov chain be 0 = Easy, 1 = Medium, 2 = Hard. The transition probability matrix is

P = | 0.50  0.25  0.25 |
    | 0.25  0.50  0.25 |
    | 0.50  0.50  0.00 |

The steady-state probabilities, calculated from the equation π = πP together with π_0 + π_1 + π_2 = 1, are

π_0 = 2/5,  π_1 = 2/5,  π_2 = 1/5.
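As a numerical check of π = πP, the steady-state distribution of the Easy/Medium/Hard chain above can be approximated by power iteration: repeatedly applying the transition matrix to any starting distribution converges to the stationary one, since the chain is irreducible and aperiodic. This is a minimal sketch, not part of the original solution; the matrix entries are taken from Problem 3.

```python
# Verify the steady-state probabilities of the Problem 3 chain
# (states 0 = Easy, 1 = Medium, 2 = Hard) by power iteration.

P = [
    [0.50, 0.25, 0.25],  # transitions out of state 0 (Easy)
    [0.25, 0.50, 0.25],  # transitions out of state 1 (Medium)
    [0.50, 0.50, 0.00],  # transitions out of state 2 (Hard)
]

def step(pi, P):
    """One application of the update pi -> pi P (row vector times matrix)."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

pi = [1.0, 0.0, 0.0]  # any starting distribution converges for this chain
for _ in range(100):
    pi = step(pi, P)

print(pi)  # approaches (2/5, 2/5, 1/5)
```

The same fixed point can be found exactly by solving the linear system π = πP with the normalization π_0 + π_1 + π_2 = 1, which gives the values stated above.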
 Spring '07
 BARD
