# ASOC ACTL2003 Support Notes: Week 2

Written by Tim Yip, Andy Wong and Andrew Teh.

## 0.1 Markov Chains

### 0.1.1 Periodicity

Definition: State $i$ has period $d$, where $d$ is the greatest common divisor of all $n$ for which $P^n_{ii} > 0$. For example, if $P^n_{ii} > 0$ for $n = 2, 4, 5$, then $d = \gcd(2, 4, 5) = 1$.

- A state with period 1 is aperiodic.
- Two states that communicate have the same period.
- A recurrent state whose expected time of return to itself is finite is positive recurrent.
- If the Markov chain has a finite number of states (generally the case in this course), all recurrent states are positive recurrent.
- A state that is both positive recurrent and aperiodic is ergodic.

### 0.1.2 Limiting Probabilities

The limiting probability is defined as $\pi_j = \lim_{n \to \infty} P^n_{ij}$. As the notation suggests, it is independent of $i$. This limiting probability exists if the Markov chain is irreducible and ergodic. $\pi_j$ can be interpreted as the long-run proportion of time that the process is in state $j$.

How to find $\pi_j$:
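The period definition above can be checked numerically: take the gcd of all return times $n$ (up to some cutoff) for which $P^n_{ii} > 0$. A minimal sketch, assuming NumPy is available; the two example matrices (`P_switch`, `P_loop`) are illustrative and not from the notes:

```python
from math import gcd

import numpy as np

def period(P, i, max_n=50):
    """Period of state i: gcd of all n <= max_n with (P^n)[i, i] > 0."""
    P = np.asarray(P, dtype=float)
    Pn = np.eye(len(P))  # P^0
    d = 0                # gcd(0, n) == n, so d accumulates the gcd
    for n in range(1, max_n + 1):
        Pn = Pn @ P
        if Pn[i, i] > 1e-12:
            d = gcd(d, n)
    return d

# A two-state chain that always switches: returns to state 0 only at n = 2, 4, 6, ...
P_switch = [[0.0, 1.0],
            [1.0, 0.0]]
print(period(P_switch, 0))  # 2 (periodic)

# Add a self-loop at state 0: return is possible at n = 1, so the gcd drops to 1.
P_loop = [[0.5, 0.5],
          [1.0, 0.0]]
print(period(P_loop, 0))  # 1 (aperiodic)
```

The cutoff `max_n` is a practical truncation; for a finite chain a cutoff of the order of the number of states squared is enough to see every attainable return time class.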

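The notes break off at "How to find $\pi_j$". One standard method (an assumption here, not necessarily the approach these notes go on to present) is to solve the balance equations $\pi = \pi P$ together with the normalisation $\sum_j \pi_j = 1$. A minimal NumPy sketch, with a hypothetical two-state transition matrix as the example:

```python
import numpy as np

def limiting_distribution(P):
    """Solve pi = pi P with sum(pi) = 1 for an irreducible, ergodic chain."""
    P = np.asarray(P, dtype=float)
    n = len(P)
    # Stack (P^T - I) pi = 0 with the normalisation row 1^T pi = 1,
    # then solve the (consistent) overdetermined system by least squares.
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Hypothetical example: state 0 = rain, state 1 = sun.
P = [[0.7, 0.3],
     [0.4, 0.6]]
pi = limiting_distribution(P)
print(pi)  # approximately [4/7, 3/7]
```

Here $0.3\,\pi_0 = 0.4\,\pi_1$ gives $\pi_0 : \pi_1 = 4 : 3$, so the chain spends a long-run proportion $4/7$ of its time in state 0, matching the interpretation of $\pi_j$ above.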
