
probability of returning k times is ρ_yy^k → 0 as k → ∞. Thus, eventually the Markov chain does not find its way back to y. In this case the state y is called transient, since after some point it is never visited by the Markov chain.

(ii) ρ_yy = 1: The probability of returning k times is ρ_yy^k = 1, so the chain returns to y infinitely many times. In this case, the state y is called recurrent: it continually recurs in the Markov chain.

To understand these notions, we turn to our examples, beginning with

Example 1.12. Gambler's ruin. Consider, for concreteness, the case N = 4.

         0     1     2     3     4
    0    1     0     0     0     0
    1   .6     0    .4     0     0
    2    0    .6     0    .4     0
    3    0     0    .6     0    .4
    4    0     0     0     0     1

We will show that eventually the chain gets stuck in either the bankrupt (0) or happy winner (4) state. In the terms of our recent definitions, we will show that the states 0 < y < 4 are transient, while the states 0 and 4 are recurrent.

It is easy to check that 0 and 4 are recurrent. Since p(0, 0) = 1, the chain comes back on the next step w...
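These claims can be checked empirically. The following is a minimal simulation sketch (pure Python; the helper names `step` and `run_until_absorbed` are ours, not from the text) of the N = 4 gambler's ruin chain above. Starting from the interior state 2, every trajectory ends up absorbed at 0 or 4, illustrating that the interior states are transient while 0 and 4 are recurrent.

```python
import random

# Transition matrix of the gambler's ruin chain with N = 4
# (states 0..4; rows are current state, columns the next state).
# States 0 and 4 are absorbing: p(0,0) = p(4,4) = 1.
P = [
    [1.0, 0.0, 0.0, 0.0, 0.0],
    [0.6, 0.0, 0.4, 0.0, 0.0],
    [0.0, 0.6, 0.0, 0.4, 0.0],
    [0.0, 0.0, 0.6, 0.0, 0.4],
    [0.0, 0.0, 0.0, 0.0, 1.0],
]

def step(state, rng):
    """Sample the next state from row `state` of P."""
    return rng.choices(range(5), weights=P[state])[0]

def run_until_absorbed(start, rng, max_steps=10_000):
    """Run the chain until it hits 0 or 4; return the final state."""
    state = start
    for _ in range(max_steps):
        if state in (0, 4):
            return state
        state = step(state, rng)
    return state  # essentially never reached for this chain

rng = random.Random(0)
results = [run_until_absorbed(2, rng) for _ in range(10_000)]

# Every run gets stuck in an absorbing state: 1, 2, 3 are transient.
print(all(r in (0, 4) for r in results))  # → True
# Empirical probability of ending as the "happy winner" from state 2;
# the exact value is (1 - 1.5^2) / (1 - 1.5^4) ≈ 0.308.
print(results.count(4) / len(results))
```

The empirical winning probability agrees with the classical gambler's ruin formula for an up-probability of .4, which is what the matrix above encodes.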

This document was uploaded on 03/06/2014 for the course MATH 4740 at Cornell University (Engineering School).
