Copyright © 2009 by Karl Sigman

1 Limiting distribution for a Markov chain

In these Lecture Notes, we shall study the limiting behavior of Markov chains as time $n \to \infty$. In particular, under suitable easy-to-check conditions, we will see that a Markov chain possesses a limiting probability distribution, $\pi = (\pi_j)_{j \in \mathcal{S}}$, and that the chain, if started off initially with such a distribution, will be a stationary stochastic process. We will also see that we can find $\pi$ by merely solving a set of linear equations.

1.1 Communication classes and irreducibility for Markov chains

For a Markov chain with state space $\mathcal{S}$, consider a pair of states $(i, j)$. We say that $j$ is reachable from $i$, denoted by $i \to j$, if there exists an integer $n \ge 0$ such that $P^n_{ij} > 0$. This means that starting in state $i$, there is a positive probability (but not necessarily equal to 1) that the chain will be in state $j$ at time $n$ (that is, $n$ steps later): $P(X_n = j \mid X_0 = i) > 0$. If $j$ is reachable from $i$, and $i$ is reachable from $j$, then the states $i$ and $j$ are said to communicate, denoted by $i \leftrightarrow j$. The relation defined by communication satisfies the following conditions:

1. All states communicate with themselves: $P^0_{ii} = 1 > 0$.
2. Symmetry: if $i \leftrightarrow j$, then $j \leftrightarrow i$.
3. Transitivity: if $i \leftrightarrow k$ and $k \leftrightarrow j$, then $i \leftrightarrow j$.

The above conditions imply that communication is an example of an equivalence relation, meaning that it shares the same properties as the more familiar equality relation "$=$": $i = i$; if $i = j$, then $j = i$; if $i = k$ and $k = j$, then $i = j$.

Only condition 3 above needs some justification, so we now prove it for completeness. Suppose there exist integers $n, m$ such that $P^n_{ik} > 0$ and $P^m_{kj} > 0$. Letting $l = n + m$, we conclude that $P^l_{ij} \ge P^n_{ik} P^m_{kj} > 0$, where we have formally used the Chapman-Kolmogorov equations. The point is that the chain can (with positive probability) go from $i$ to $j$ by first going from $i$ to $k$ ($n$ steps) and then (independently of the past) going from $k$ to $j$ (an additional $m$ steps).

If we consider the rat in the open maze, we easily see that the states in $C_1 = \{1, 2, 3, 4\}$ all communicate with one another, but state 0 only communicates with itself (since it is an absorbing state). Whereas state 0 is reachable from the other states, $i \to 0$, no other state can be reached from state 0. We conclude that the state space $\mathcal{S} = \{0, 1, 2, 3, 4\}$ can be broken up into two disjoint subsets, $C_1 = \{1, 2, 3, 4\}$ and $C_2 = \{0\}$, whose union equals $\mathcal{S}$, and such that each of these subsets has the property that all states within it communicate. Disjoint means that their intersection contains no elements: $C_1 \cap C_2 = \emptyset$.
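To make reachability and communication concrete, here is a minimal computational sketch, not part of the original notes, that finds the communication classes of a finite-state chain directly from its one-step transition matrix. The matrix P below is only an assumed stand-in for the rat-in-the-open-maze chain (state 0 absorbing, states 1 through 4 moving among one another); the actual transition probabilities used in the full notes may differ.

import numpy as np

# Hypothetical transition matrix for illustration: state 0 is absorbing,
# states 1-4 communicate with one another. Each row sums to 1.
P = np.array([
    [1.0,  0.0,  0.0,  0.0,  0.0 ],   # state 0: absorbing
    [0.25, 0.0,  0.25, 0.25, 0.25],
    [0.0,  1/3,  0.0,  1/3,  1/3 ],
    [0.0,  1/3,  1/3,  0.0,  1/3 ],
    [0.25, 0.25, 0.25, 0.25, 0.0 ],
])
n = P.shape[0]

# reach[i, j] is True iff j is reachable from i (i -> j), i.e. P^n_{ij} > 0
# for some n >= 0.  Start with the n = 0 case (every state reaches itself)
# and the n = 1 case, then take the transitive closure.
reach = (P > 0) | np.eye(n, dtype=bool)
for k in range(n):
    for i in range(n):
        for j in range(n):
            reach[i, j] = reach[i, j] or (reach[i, k] and reach[k, j])

# i and j communicate iff each is reachable from the other.
communicate = reach & reach.T

# Group the states into communication (equivalence) classes.
classes = []
seen = set()
for i in range(n):
    if i not in seen:
        cls = {j for j in range(n) if communicate[i, j]}
        classes.append(sorted(cls))
        seen |= cls

print(classes)

Running this prints [[0], [1, 2, 3, 4]], matching the two classes $C_2 = \{0\}$ and $C_1 = \{1, 2, 3, 4\}$ identified above.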