STA 3007 Applied Probability 2005
Tutorial 6

1. The Long Run Behavior of Markov Chains

(a) Regular Markov Matrices

A Markov matrix P is said to be regular if P^k has all of its elements strictly positive for some power k.

Theorem 2.1. Let P be a regular Markov matrix on the states 0, 1, ..., N. Then the limiting distribution \pi = (\pi_0, \pi_1, \ldots, \pi_N) is the unique nonnegative solution of the equations

    \pi_j = \sum_{k=0}^{N} \pi_k P_{kj}, \qquad j = 0, 1, \ldots, N,

    \sum_{k=0}^{N} \pi_k = 1.

(A numerical sketch of solving these equations is given at the end of this section.)

(b) The Classification of States

i. Irreducible Markov Chain

   a. j is accessible from i: P^{(n)}_{ij} > 0 for some integer n.
   b. If P^{(n)}_{ij} > 0 and P^{(m)}_{ji} > 0 for some integers n, m, then i \leftrightarrow j, i.e. i and j communicate.

   Irreducible Markov chain: all states communicate with each other.
   Equivalence class: a class contains the states that communicate with each other.

ii. Periodicity of a Markov Chain

   a. d(i) = period of state i = H.C.F. (highest common factor) of all n > 0 such that P^{(n)}_{ii} > 0.

   Example (from Lecture Notes): P_{00} = 0, P^{(2)}_{00} = 0, P^{(3)}_{00} = 0...
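The limiting distribution of Theorem 2.1 and the period d(i) can both be checked numerically. Below is a minimal Python sketch, assuming NumPy and a made-up regular 3-state matrix P (not taken from the lecture notes): it solves the stationary equations of Theorem 2.1, confirms the same limit by raising P to a high power, and computes d(0) as the greatest common divisor of the return times n with P^{(n)}_{00} > 0 up to a finite horizon.

```python
# Minimal sketch (illustrative 3-state chain, not from the lecture notes).
import numpy as np
from math import gcd
from functools import reduce

# A made-up transition matrix: every entry of P is already positive,
# so P is regular with k = 1.  Rows sum to 1.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])
N = P.shape[0] - 1  # states 0, 1, ..., N

# (a) Theorem 2.1: solve pi_j = sum_k pi_k P_kj together with sum_k pi_k = 1.
#     In matrix form, pi (P - I) = 0 plus a normalisation row; solve by least squares.
A = np.vstack([(P - np.eye(N + 1)).T, np.ones(N + 1)])
b = np.append(np.zeros(N + 1), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("limiting distribution pi:", pi)

# Regularity check: every row of P^k approaches pi as k grows.
print("first row of P^50:       ", np.linalg.matrix_power(P, 50)[0])

# (b) ii. Period of state i: H.C.F. of all n > 0 with P^(n)_{ii} > 0,
#     scanned only up to a finite horizon (enough for small examples).
def period(P, i, horizon=50):
    returns = [n for n in range(1, horizon + 1)
               if np.linalg.matrix_power(P, n)[i, i] > 0]
    return reduce(gcd, returns) if returns else 0  # 0 if no return is seen

print("period of state 0:", period(P, 0))
```

Since the example matrix is regular, the printed period is 1. The finite-horizon scan is a quick check rather than a proof, because the period is defined as the H.C.F. over all n > 0.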