2. Simplify each of the following, using the laws of exponents:
   (a) (x^(1/3))^2 · x^5 = ?
   (b) x^2 · (x^5)^(1/3) = ?
   (c) [expression garbled in the source preview]

3. A stationary discrete-time Markov chain has 3 states and the following transition probability matrix P from one step in time to the next (i.e., from time 0 to time 1):

           | .2  .3  .5 |
       P = | .1  .9   0 |
           | .6  .4   0 |

   (a) Sketch the state transition diagram for the Markov chain.
   (b) What is the transition probability matrix from time 0 to time 2?
   (c) What is the transition probability matrix from time 0 to time 4?
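Parts (b) and (c) of the Markov-chain problem follow from the Chapman–Kolmogorov equations: the n-step transition probability matrix is the n-th matrix power of P. A minimal NumPy sketch, assuming the transition matrix reads P = [[.2, .3, .5], [.1, .9, 0], [.6, .4, 0]] (the recoverable entries from the problem statement; the placement of the two zero entries is an assumption, forced only by each row of a stochastic matrix summing to 1):

```python
import numpy as np

# Assumed transition matrix from the problem statement; the two zero
# entries are an assumption (each row of a stochastic matrix sums to 1).
P = np.array([
    [0.2, 0.3, 0.5],
    [0.1, 0.9, 0.0],
    [0.6, 0.4, 0.0],
])

# Chapman-Kolmogorov: the n-step transition matrix is the n-th power of P.
P2 = np.linalg.matrix_power(P, 2)  # transitions from time 0 to time 2
P4 = np.linalg.matrix_power(P, 4)  # transitions from time 0 to time 4

print("P^(2) =\n", P2)
print("P^(4) =\n", P4)
```

Note that P^(4) can equivalently be computed as (P^(2))^2, since squaring the two-step matrix composes two independent two-step transitions.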
This note was uploaded on 03/17/2010 for the course IOE 316 taught by Professor Dolinskaya during the Winter '08 term at University of Michigan.