
...of Theorem 1.5 is

Lemma 1.6. If x is recurrent and ρxy > 0, then ρyx = 1.

Proof. If ρyx < 1, then Lemma 1.5 would imply that x is transient, contradicting the assumption that x is recurrent.

To be able to analyze any finite state Markov chain we need some theory. To motivate the developments, consider

Example 1.14. A seven-state chain. Consider the transition probability:

        1    2    3    4    5    6    7
   1   .7    0    0    0   .3    0    0
   2   .1   .2   .3   .4    0    0    0
   3    0    0   .5   .3   .2    0    0
   4    0    0    0   .5    0   .5    0
   5   .6    0    0    0   .4    0    0
   6    0    0    0    0    0   .2   .8
   7    0    0    0    1    0    0    0

To identify the states that are recurrent and those that are transient, we begin by drawing a graph that contains an arc from i to j if p(i, j) > 0 and i ≠ j. We do not worry about drawing the self-loops corresponding to states with p(i, i) > 0, since such transitions cannot help the chain get somewhere new. In the case under consideration the graph is

[Figure: directed graph on states 1-7 with arcs 2 → 1, 1 ↔ 5, 2 → 3, 2 → 4, 3 → 4, 3 → 5, 4 → 6, 6 → 7, 7 → 4]

The state 2 communicates with 1, which does not communicate with it, so Theorem 1.5 implies that 2 is transient. Likewise 3 communicates with 4, which does...
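The graphical procedure in the example can be sketched in code. This is an illustrative script, not from the text: it builds the arc relation p(i, j) > 0 for the seven-state chain, computes which states are reachable from each state, and then uses the finite-chain criterion that a state is recurrent exactly when its communicating class is closed (every state it can reach leads back to it).

```python
def reachable(adj, start):
    """Return the set of states reachable from `start` in the directed graph `adj`."""
    seen, stack = set(), [start]
    while stack:
        u = stack.pop()
        if u in seen:
            continue
        seen.add(u)
        stack.extend(adj[u])
    return seen

# Transition probabilities of the seven-state chain (zero entries omitted).
P = {
    1: {1: .7, 5: .3},
    2: {1: .1, 2: .2, 3: .3, 4: .4},
    3: {3: .5, 4: .3, 5: .2},
    4: {4: .5, 6: .5},
    5: {1: .6, 5: .4},
    6: {6: .2, 7: .8},
    7: {4: 1.0},
}

# Arc i -> j whenever p(i, j) > 0; self-loops are harmless for reachability.
adj = {i: [j for j, p in row.items() if p > 0] for i, row in P.items()}
reach = {i: reachable(adj, i) for i in P}

recurrent, transient = set(), set()
for i in P:
    # i is recurrent iff every state reachable from i can reach i back,
    # i.e. the communicating class of i is closed.
    if all(i in reach[j] for j in reach[i]):
        recurrent.add(i)
    else:
        transient.add(i)

print("recurrent:", sorted(recurrent))  # {1, 5} and {4, 6, 7}
print("transient:", sorted(transient))  # {2, 3}
```

Running this confirms the analysis in the text: states 2 and 3 are transient, while the closed classes {1, 5} and {4, 6, 7} consist of recurrent states.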

This document was uploaded on 03/06/2014 for the course MATH 4740 at Cornell University (Engineering School).
