E3106, Solutions to Homework 3
Columbia University, Spring '09

Problem 4.18. Define

\[
X_n = \begin{cases} 0 & \text{if coin 1 is flipped on the $n$th day,} \\ 1 & \text{if coin 2 is flipped on the $n$th day.} \end{cases}
\]

Then $\{X_n,\, n \ge 0\}$ is an irreducible ergodic Markov chain with transition probability matrix

\[
P = \begin{pmatrix} 0.6 & 0.4 \\ 0.5 & 0.5 \end{pmatrix}.
\]

The limiting probabilities satisfy

\[
\pi_0 + \pi_1 = 1, \qquad \pi_0 = 0.6\,\pi_0 + 0.5\,\pi_1.
\]

These solve to yield $\pi_0 = \tfrac{5}{9}$, $\pi_1 = \tfrac{4}{9}$.

(a) The desired proportion equals $\pi_0 = \tfrac{5}{9}$.

(b)
\[
P^4 = \begin{pmatrix} 0.6 & 0.4 \\ 0.5 & 0.5 \end{pmatrix}^4 = \begin{pmatrix} 0.5556 & 0.4444 \\ 0.5555 & 0.4445 \end{pmatrix}.
\]
The desired probability equals $P^4_{01} = 0.4444$.

Problem 4.20. We have an irreducible, aperiodic Markov chain with finite state space $\{0, \ldots, M\}$ such that

\[
\sum_{i=0}^{M} P_{ij} = 1 \quad \text{for all } j \in \{0, \ldots, M\}.
\]

Since the chain has only one class and finitely many states, it is recurrent (Remark (ii), page 193); because the state space is finite, it is also positive recurrent (see page 200). Hence the limiting probabilities exist and are unique, so it suffices to show that $\pi_i = \frac{1}{M+1}$ for all $i \in \{0, \ldots, M\}$ solves equations (4.7) on page 201. Indeed,

\[
\pi_j = \frac{1}{M+1} = \frac{1}{M+1} \sum_{i=0}^{M} P_{ij} = \sum_{i=0}^{M} \frac{1}{M+1}\, P_{ij} = \sum_{i=0}^{M} \pi_i P_{ij},
\]

and

\[
\sum_{j=0}^{M} \pi_j = \sum_{j=0}^{M} \frac{1}{M+1} = \frac{M+1}{M+1} = 1.
\]
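As an illustrative numerical check (not part of the textbook's solution), the claims in both problems can be verified with a short Python sketch. The helper `mat_mul` and the sample doubly stochastic matrix `Q` are my own additions for demonstration.

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows (helper, assumed)."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Problem 4.18: two-state chain for the coin-flipping process.
P = [[0.6, 0.4],
     [0.5, 0.5]]

# Limiting probabilities from pi0 = 0.6*pi0 + 0.5*pi1 with pi0 + pi1 = 1.
pi0, pi1 = 5 / 9, 4 / 9
assert abs(pi0 - (0.6 * pi0 + 0.5 * pi1)) < 1e-12

# Four-step transition matrix P^4; entry [0][1] answers part (b).
P4 = mat_mul(mat_mul(P, P), mat_mul(P, P))
print(round(P4[0][1], 4))  # 0.4444

# Problem 4.20: for a doubly stochastic matrix (columns also sum to 1),
# the uniform distribution pi_i = 1/(M+1) is stationary. Q is a made-up example.
Q = [[0.2, 0.5, 0.3],
     [0.3, 0.3, 0.4],
     [0.5, 0.2, 0.3]]
M = len(Q) - 1
pi = [1 / (M + 1)] * (M + 1)
new_pi = [sum(pi[i] * Q[i][j] for i in range(M + 1)) for j in range(M + 1)]
assert all(abs(a - b) < 1e-12 for a, b in zip(pi, new_pi))
```

Repeated squaring (`P^2` then `(P^2)^2`) is enough here; for larger powers one would typically reach for a linear-algebra library instead of hand-rolled multiplication.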