Ergodic Markov Chains
11/17/2005

Definition

• A Markov chain is called an ergodic chain if it is possible to go from every state to every state (not necessarily in one move).
• Ergodic Markov chains are also called irreducible.
• A Markov chain is called a regular chain if some power of the transition matrix has only positive elements.

Example

• Let the transition matrix of a Markov chain be defined by

             1    2
    P =  1 ( 0    1 )
         2 ( 1    0 )

• Then this is an ergodic chain which is not regular: every power of P is either P itself or the identity matrix, so no power has all positive entries.

Example: Ehrenfest Model

• We have two urns that, between them, contain four balls.
• At each step, one of the four balls is chosen at random and moved from the urn that it is in into the other urn.
• We choose, as states, the number of balls in the first urn.

              0     1     2     3     4
         0 (  0     1     0     0     0  )
         1 ( 1/4    0    3/4    0     0  )
    P =  2 (  0    1/2    0    1/2    0  )
         3 (  0     0    3/4    0    1/4 )
         4 (  0     0     0     1     0  )

Regular Markov Chains

• Any transition matrix that has no zeros determines a regular Markov chain.
• However, it is possible for a regular Markov chain to have a transition matrix that has zeros.
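The definitions above can be checked numerically: a chain is regular exactly when some power of its transition matrix has all strictly positive entries. Below is a minimal sketch using NumPy; the helper `is_regular` and its power cap of 50 are illustrative choices, not part of the notes.

```python
import numpy as np

def is_regular(P, max_power=50):
    """Return True if some power of P (up to max_power) is strictly positive.
    The cap is a practical cutoff for this sketch, not a theoretical bound."""
    Q = P.copy()
    for _ in range(max_power):
        if (Q > 0).all():
            return True
        Q = Q @ P
    return False

# The two-state chain from the first example: alternates deterministically,
# so it is ergodic but not regular (powers cycle between P and the identity).
P_swap = np.array([[0.0, 1.0],
                   [1.0, 0.0]])

# Ehrenfest model with 4 balls: state i = number of balls in the first urn.
n = 4
P_ehrenfest = np.zeros((n + 1, n + 1))
for i in range(n + 1):
    if i > 0:
        P_ehrenfest[i, i - 1] = i / n        # a ball leaves the first urn
    if i < n:
        P_ehrenfest[i, i + 1] = (n - i) / n  # a ball enters the first urn

# A regular chain whose matrix contains a zero, illustrating the last bullet.
P_zero = np.array([[0.5, 0.5],
                   [1.0, 0.0]])

print(is_regular(P_swap))       # False: ergodic but not regular
print(is_regular(P_ehrenfest))  # False: the parity of the state alternates
print(is_regular(P_zero))       # True: P_zero squared is strictly positive
```

The Ehrenfest chain fails the regularity test for the same structural reason as the two-state chain: the state's parity flips at every step, so every power of the matrix keeps zero entries.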
This note was uploaded on 07/16/2010 for the course MATH 20 taught by Professor Ionescu during the Fall '05 term at Dartmouth.