IEOR E4701 Assignment 3 Solutions, Summer 2011

Question 1

Consider the Markov chain on the state space {0, 1, 2} whose transition probabilities are given by

        | 0    1/2  1/2 |
    P = | 1/3  0    2/3 |
        | 3/4  1/4  0   |

a) Is the chain irreducible? Why?
b) Is the chain positive recurrent? Why?
c) Compute the equilibrium distribution in two different ways.

Answer

a) Notice that p(i, j) > 0 for every i ≠ j. This means that every state is reachable from every other state in a single step, regardless of where the Markov chain starts. Therefore, the Markov chain is irreducible.

b) As we observed in class, every irreducible Markov chain on a finite state space is positive recurrent. Since this chain is irreducible and has only three states, we may conclude that it is positive recurrent.

c) Since the given Markov chain is irreducible and positive recurrent, we know that the equilibrium (steady-state) distribution π is unique. We will find π using two different methods: the first via the formula π^T = π^T P, and the second through the relation

    π(i) = 1 / E_i[τ_i],   where τ_i = min{n ≥ 1 : X_n = i},   for i = 0, 1, 2.

Method 1: π^T = π^T P.

    (π(0), π(1), π(2)) = (π(0), π(1), π(2)) P

leads to the following system of linear equations:

    π(0) = (1/3) π(1) + (3/4) π(2)
    π(1) = (1/2) π(0) + (1/4) π(2)
    π(2) = (1/2) π(0) + (2/3) π(1)

which, together with the normalization π(0) + π(1) + π(2) = 1, gives

    π = (4/11, 3/11, 4/11).

Method 2: π(i) = 1 / E_i[τ_i], where τ_i = min{n ≥ 1 : X_n = i}, for i = 0, 1, 2.

We carry out the computation for i = 0. Conditioning on the first step and using the Markov property,

    E_0[τ_0] = p(0,1) E(τ_0 | X_0 = 0, X_1 = 1) + p(0,2) E(τ_0 | X_0 = 0, X_1 = 2)
             = 1 + (1/2) g(1) + (1/2) g(2),

where g(i) = E(T | X_0 = i) with T = min{n ≥ 0 : X_n = 0}, the first hitting time of state 0; the leading 1 accounts for the first step out of state 0.

To find g(i) we set up the following system of linear equations, again by conditioning on the first step:

    For i = 0:  g(0) = 0.
    For i = 1:  g(1) = 1 + p(1,0) g(0) + p(1,2) g(2) = 1 + (2/3) g(2).
    For i = 2:  g(2) = 1 + p(2,0) g(0) + p(2,1) g(1) = 1 + (1/4) g(1).

Solving gives g(1) = 2 and g(2) = 3/2, so

    E_0[τ_0] = 1 + (1/2)(2) + (1/2)(3/2) = 11/4,   hence   π(0) = 1 / E_0[τ_0] = 4/11,

in agreement with Method 1. The same computation with the roles of the states exchanged yields E_1[τ_1] = 11/3 and E_2[τ_2] = 11/4, so π(1) = 3/11 and π(2) = 4/11.
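As a sanity check (not part of the original solution), both methods can be reproduced in a few lines of Python using exact rational arithmetic. The sketch below assumes the transition matrix P reconstructed above; `solve` is a small hypothetical helper implementing Gauss-Jordan elimination over fractions.

```python
# Sanity check for both methods, using exact rational arithmetic.
# Assumes the transition matrix P reconstructed in the solution above.
from fractions import Fraction as F

P = [[F(0),    F(1, 2), F(1, 2)],
     [F(1, 3), F(0),    F(2, 3)],
     [F(3, 4), F(1, 4), F(0)]]
n = 3

def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination over exact fractions."""
    m = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(m):
        piv = next(r for r in range(col, m) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        for r in range(m):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[i][m] for i in range(m)]

# Method 1: balance equations pi^T = pi^T P plus normalization.
# Row j encodes sum_i pi(i) p(i,j) - pi(j) = 0; one redundant balance
# equation is replaced by pi(0) + pi(1) + pi(2) = 1.
A = [[P[i][j] - (F(1) if i == j else F(0)) for i in range(n)] for j in range(n)]
b = [F(0)] * n
A[n - 1] = [F(1)] * n
b[n - 1] = F(1)
pi = solve(A, b)
assert pi == [F(4, 11), F(3, 11), F(4, 11)]

# Method 2: g = expected hitting times of state 0 from states 1 and 2,
# solving (I - Q) g = 1 where Q is P restricted to the states {1, 2}.
IQ = [[(F(1) if i == j else F(0)) - P[i + 1][j + 1] for j in range(2)]
      for i in range(2)]
g = solve(IQ, [F(1), F(1)])
assert g == [F(2), F(3, 2)]                 # g(1) = 2, g(2) = 3/2

m0 = 1 + P[0][1] * g[0] + P[0][2] * g[1]    # E_0[tau_0]
assert m0 == F(11, 4)
assert F(1) / m0 == pi[0]                   # pi(0) = 1 / E_0[tau_0] = 4/11
```

Using `Fraction` rather than floats keeps every quantity exact, so the asserted values 4/11, 3/11, 4/11 and 11/4 match the hand computation with no rounding concerns.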
Instructor: Karl Sigman
