STA 3007 Applied Probability 2005
Tutorial 6

1. The Long Run Behavior of Markov Chains

   (a) Regular Markov Matrices

       A Markov matrix P is said to be regular if, for some power k, every
       element of P^k is strictly positive.

       Theorem 2.1. Let P be a regular Markov matrix on the states
       0, 1, ..., N. Then the limiting distribution π = (π_0, π_1, ..., π_N)
       is the unique nonnegative solution of the equations

           π_j = Σ_{k=0}^{N} π_k P_{kj},   j = 0, 1, ..., N,

           Σ_{k=0}^{N} π_k = 1.

   (b) The Classification of States

       i. Irreducible Markov Chain

          a. State j is accessible from state i if and only if
             P_{ij}^{(n)} > 0 for some integer n ≥ 0.
          b. If P_{ij}^{(n)} > 0 and P_{ji}^{(m)} > 0 for some integers
             n, m, then i ↔ j, i.e. states i and j communicate.

          Irreducible Markov chain: all states communicate with each other.
          Equivalence class: a class consisting of the states that
          communicate with each other.

       ii. Periodicity of a Markov Chain

          a. d(i) = period of state i = H.C.F. (highest common factor) of
             all n > 0 such that P_{ii}^{(n)} > 0.

             Example (from lecture notes): P_{00} = 0, P_{00}^{(2)} = 0,
             P_{00}^{(3)} = 0, ...
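The equations of Theorem 2.1 can be solved numerically as a linear system: stack (P^T − I)π = 0 with the normalization Σ π_k = 1. The sketch below uses a made-up 3-state regular matrix (the specific entries are an illustrative assumption, not from the tutorial).

```python
import numpy as np

# A hypothetical regular Markov matrix (every entry of P^2 is positive).
P = np.array([
    [0.0, 0.5, 0.5],
    [0.3, 0.4, 0.3],
    [0.6, 0.2, 0.2],
])

# pi = pi P  is equivalent to  (P^T - I) pi = 0.  Replace one redundant
# row of that singular system by the constraint  sum(pi) = 1.
N = P.shape[0]
A = P.T - np.eye(N)
A[-1, :] = 1.0                 # normalization row: pi_0 + ... + pi_N = 1
b = np.zeros(N)
b[-1] = 1.0
pi = np.linalg.solve(A, b)

# pi is a nonnegative probability vector fixed by P, as Theorem 2.1 asserts.
assert np.all(pi >= 0)
assert np.isclose(pi.sum(), 1.0)
assert np.allclose(pi @ P, pi)
```

Because P is regular, the theorem guarantees this solution is unique, so replacing any one row (not just the last) by the normalization constraint yields the same π.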
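The definition of the period d(i) can be checked directly by taking the H.C.F. (gcd) of the return times n with P_{ii}^{(n)} > 0. A minimal sketch, assuming a small state space and truncating the search at a chosen horizon `max_n` (the function name and cutoff are illustrative, not from the notes):

```python
from math import gcd

import numpy as np

def period(P, i, max_n=50):
    """gcd of all n in 1..max_n with P^(n)[i, i] > 0; 0 if no return seen."""
    d = 0
    Pn = np.eye(P.shape[0])
    for n in range(1, max_n + 1):
        Pn = Pn @ P                # Pn now holds P^n
        if Pn[i, i] > 0:
            d = gcd(d, n)          # gcd(0, n) == n handles the first hit
    return d

# Deterministic cycle 0 -> 1 -> 2 -> 0: state 0 recurs only at n = 3, 6, ...
C = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
print(period(C, 0))  # 3
```

Truncating at max_n is safe here because the gcd of the first few return times already stabilizes; for an arbitrary chain one would argue from the structure of the transition graph instead.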