Stat 150 Stochastic Processes, Spring 2009
Lecture 17: Limit distributions for Markov Chains
Lecturer: Jim Pitman

Consider a finite state space Markov chain $X_0, X_1, \dots$ with state space $S$ having $N$ elements, $N < \infty$, and transition matrix $P$, so that $P_i(X_n = j) = P^n(i,j)$.

Problem: Suppose you know the initial distribution $\lambda$ of $X_0$, that is, $\lambda_j = P(X_0 = j)$. We want to evaluate $\lim_{n \to \infty} P_\lambda(X_n = j)$. Notation: $P_\lambda(\cdot) := \sum_i \lambda_i P_i(\cdot)$.

Questions: When does the limit exist? If it exists, how do we evaluate it?

We say $P$ is regular if there exists $n$ such that $P^n(i,j) > 0$ for all $i, j \in S$. Equivalently, for every $i$ and $j$ there is some sequence $i_0 = i, i_1, i_2, \dots, i_n = j$ with $P(i_{k-1}, i_k) > 0$ for all $1 \le k \le n$, i.e. some path from $i$ to $j$ in exactly $n$ steps.

Obviously $P^n(i,j) > 0$ for all $i,j$ implies $P^m(i,j) > 0$ for all $i,j$ and all $m \ge n$: let $m = n + k$; then
$$P^{n+k}(i,j) = \sum_l P^k(i,l)\, P^n(l,j) > 0,$$
because $P^n(l,j) > 0$ for all $l$, and $\sum_l P^k(i,l) = 1$, so $P^k(i,l) > 0$ for some $l$.

Theorem: If $P$ is a regular transition matrix on a finite set, then there exists a unique probability distribution $\pi$ on $S$ such that $\pi P = \pi$ ($\pi$ is called the stationary, equilibrium, invariant, or steady-state distribution).
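As a quick numerical illustration of the regularity condition and of the equation $\pi P = \pi$, here is a minimal Python sketch. The 3-state matrix `P`, the helper functions `is_regular` and `stationary`, and the choice of computing $\pi$ as the left eigenvector of $P$ for eigenvalue $1$ are illustrative assumptions, not part of the lecture.

```python
# Minimal sketch (not from the lecture): check regularity of a made-up
# 3-state transition matrix and compute its stationary distribution pi.
import numpy as np

P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

def is_regular(P, max_power=100):
    """Return True if some power P^n has all entries strictly positive."""
    Q = np.eye(len(P))
    for _ in range(max_power):
        Q = Q @ P
        if np.all(Q > 0):
            return True
    return False

def stationary(P):
    """Solve pi P = pi with sum(pi) = 1 via the left eigenvector of P
    for eigenvalue 1 (equivalently, a right eigenvector of P^T)."""
    w, v = np.linalg.eig(P.T)
    k = np.argmin(np.abs(w - 1.0))   # eigenvalue closest to 1
    pi = np.real(v[:, k])
    return pi / pi.sum()

pi = stationary(P)
print("regular:", is_regular(P))
print("pi     :", pi)
print("pi P   :", pi @ P)            # should reproduce pi

# Rows of P^n approach pi as n grows.
print("P^50[0]:", np.linalg.matrix_power(P, 50)[0])
```

Since this $P$ is regular, every row of $P^{50}$ is already very close to $\pi$; this is the sense in which $\lim_{n \to \infty} P_\lambda(X_n = j) = \pi_j$ for any initial distribution $\lambda$, the limit asked about in the Problem above.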