ch2 - Long Run Behavior of Markov Chains: Regular Markov Matrices

1  Long Run Behavior of Markov Chains (Chapter 2)

2  Regular Markov matrices (Chapter 2.1)

3  Definition
- A Markov matrix P is said to be regular if P^k has all of its elements strictly positive for some power k.
- The corresponding Markov chain is called a regular chain.
- Suppose a regular chain has state space {0, 1, ..., N}. Then the limiting probability \pi = (\pi_0, \pi_1, \ldots, \pi_N) exists and is independent of the initial state of the chain.

4  Mathematical Description
- Let P = \|P_{ij}\|.
- In terms of the Markov chain \{X_n\},

    \lim_{n \to \infty} P_{ij}^{(n)} = \pi_j > 0 \quad \text{for } j = 0, 1, \ldots, N,

  that is,

    \lim_{n \to \infty} \Pr\{X_n = j \mid X_0 = i\} = \pi_j > 0 \quad \text{for } j = 0, 1, \ldots, N.
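The convergence on slide 4 can be checked numerically by raising P to successive powers. Below is a minimal Python sketch (my addition, not part of the original slides; `numpy` and the 3-state matrix are assumptions chosen only for illustration) that finds the first power k for which every entry of P^k is strictly positive and then shows the rows of a high power settling onto a common vector.

```python
import numpy as np

def first_positive_power(P, max_power=50):
    """Return the smallest k <= max_power with all entries of P^k strictly positive, else None."""
    Pk = np.eye(P.shape[0])
    for k in range(1, max_power + 1):
        Pk = Pk @ P
        if np.all(Pk > 0):
            return k
    return None

# Hypothetical 3-state Markov matrix, used only to illustrate the definition of regularity.
P = np.array([[0.0, 1.0, 0.0],
              [0.2, 0.0, 0.8],
              [0.5, 0.5, 0.0]])

print("first strictly positive power:", first_positive_power(P))

# Each row of P^n approaches the same limiting vector pi as n grows.
print(np.round(np.linalg.matrix_power(P, 50), 4))
```

Every row of the last printout is, to numerical precision, the limiting distribution described on slide 4, regardless of the starting state.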
5  Example: social class problem
- Transition probabilities from the father's class (rows) to the son's class (columns), with states 0 = lower, 1 = middle, 2 = upper:

    P = \begin{pmatrix} 0.40 & 0.50 & 0.10 \\ 0.05 & 0.70 & 0.25 \\ 0.05 & 0.50 & 0.45 \end{pmatrix}

6  What fraction of people are in the middle class?

    P^8 \approx \begin{pmatrix} 0.0772 & 0.625 & 0.2978 \\ 0.0769 & 0.625 & 0.2981 \\ 0.0769 & 0.625 & 0.2981 \end{pmatrix}

- Therefore, in the long run, approximately 62.5% of the population is middle class.

7  Example: a regular Markov matrix

    P = \begin{pmatrix}
          0.9 & 0.1 & 0   & 0   & 0   & 0   & 0   \\
          0.9 & 0   & 0.1 & 0   & 0   & 0   & 0   \\
          0.9 & 0   & 0   & 0.1 & 0   & 0   & 0   \\
          0.9 & 0   & 0   & 0   & 0.1 & 0   & 0   \\
          0.9 & 0   & 0   & 0   & 0   & 0.1 & 0   \\
          0.9 & 0   & 0   & 0   & 0   & 0   & 0.1 \\
          0.9 & 0   & 0   & 0   & 0   & 0   & 0.1
        \end{pmatrix}

- Writing + for a strictly positive entry and 0 for a zero entry, P has the pattern

    + + 0 0 0 0 0
    + 0 + 0 0 0 0
    + 0 0 + 0 0 0
    + 0 0 0 + 0 0
    + 0 0 0 0 + 0
    + 0 0 0 0 0 +
    + 0 0 0 0 0 +

8  The shape of P^2, P^4 and P^8
- The zero entries fill in as the power grows:

    P^2:                P^4:                P^8:
    + + + 0 0 0 0       + + + + + 0 0       + + + + + + +
    + + 0 + 0 0 0       + + + + 0 + 0       + + + + + + +
    + + 0 0 + 0 0       + + + + 0 0 +       + + + + + + +
    + + 0 0 0 + 0       + + + + 0 0 +       + + + + + + +
    + + 0 0 0 0 +       + + + + 0 0 +       + + + + + + +
    + + 0 0 0 0 +       + + + + 0 0 +       + + + + + + +
    + + 0 0 0 0 +       + + + + 0 0 +       + + + + + + +

- Since every entry of P^8 is strictly positive, P is a regular Markov matrix.
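Both of these computations can be reproduced in a few lines. The sketch below (my addition; it assumes `numpy` is available, and the two matrices are transcribed from slides 5 and 7 above) prints P^8 for the social class chain, up to rounding, and the +/0 pattern of P^2, P^4 and P^8 for the 7-state example.

```python
import numpy as np

# Social class transition matrix (states 0 = lower, 1 = middle, 2 = upper), from slide 5.
P_social = np.array([[0.40, 0.50, 0.10],
                     [0.05, 0.70, 0.25],
                     [0.05, 0.50, 0.45]])
print(np.round(np.linalg.matrix_power(P_social, 8), 3))   # every row is roughly (0.077, 0.625, 0.298)

# 7-state example from slide 7: step up one state w.p. 0.1, fall back to state 0 w.p. 0.9,
# and the top state holds with the remaining probability 0.1.
N = 7
P = np.zeros((N, N))
P[:, 0] = 0.9
for i in range(N - 1):
    P[i, i + 1] = 0.1
P[N - 1, N - 1] = 0.1

def sign_pattern(M):
    """Render a matrix as '+' where an entry is positive and '0' where it vanishes."""
    return "\n".join(" ".join("+" if x > 0 else "0" for x in row) for row in M)

for k in (2, 4, 8):
    print(f"pattern of P^{k}:\n{sign_pattern(np.linalg.matrix_power(P, k))}\n")
```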
9  Regular matrix criteria
- Every Markov matrix with N states that satisfies the following two conditions is regular:
  1. For every pair of states i, j there is a path k_1, ..., k_r connecting them with positive probability, i.e. P_{i k_1} P_{k_1 k_2} \cdots P_{k_r j} > 0.
  2. There is at least one state j for which P_{jj} > 0.

10  Theorem 2.1
- Let P be a regular Markov matrix on the states 0, 1, ..., N. Then the limiting distribution \pi = (\pi_0, \pi_1, \ldots, \pi_N) is the unique nonnegative solution of the equations

    \pi_j = \sum_{k=0}^{N} \pi_k P_{kj}, \quad j = 0, 1, \ldots, N, \qquad \sum_{k=0}^{N} \pi_k = 1.

11  Proof
- Because the Markov chain is regular, the limiting distribution exists. We consider

    P_{ij}^{(n)} = \sum_{k=0}^{N} P_{ik}^{(n-1)} P_{kj}, \quad j = 0, 1, \ldots, N.

- Now we let n tend to infinity to yield

    \pi_j = \sum_{k=0}^{N} \pi_k P_{kj}, \quad j = 0, 1, \ldots, N.

12  Proof of the uniqueness
- Suppose there is another solution of the equations, called \theta. We want to show that \theta = \pi. Iterating the balance equations and using \sum_{k=0}^{N} \theta_k = 1, we get, for instance for state 0,

    \theta_0 = \sum_{\ell=0}^{N} \theta_\ell P_{\ell 0}
             = \sum_{\ell=0}^{N} \Big( \sum_{k=0}^{N} \theta_k P_{k \ell} \Big) P_{\ell 0}
             = \sum_{k=0}^{N} \theta_k P_{k 0}^{(2)}
             = \cdots
             = \sum_{k=0}^{N} \theta_k P_{k 0}^{(n)}
             \;\longrightarrow\; \pi_0 \sum_{k=0}^{N} \theta_k = \pi_0 \quad \text{as } n \to \infty,

  and the same argument applied to every state j gives \theta_j = \pi_j, so \theta = \pi.
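Theorem 2.1 reduces the computation of \pi to a linear system: the balance equations \pi = \pi P together with the normalization \sum_k \pi_k = 1. The following sketch (my addition, assuming `numpy`) solves that system by overwriting one redundant balance equation with the normalization row, and checks the result against the social class matrix of slide 5.

```python
import numpy as np

def limiting_distribution(P):
    """Solve pi = pi P together with sum(pi) = 1, as in Theorem 2.1."""
    n = P.shape[0]
    # Balance equations pi (I - P) = 0, written column-wise as (I - P^T) pi = 0.
    A = np.eye(n) - P.T
    # One balance equation is redundant; replace the last row by the normalization sum(pi) = 1.
    A[-1, :] = 1.0
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

# Social class matrix from slide 5; expect roughly (0.077, 0.625, 0.298), matching the rows of P^8 on slide 6.
P = np.array([[0.40, 0.50, 0.10],
              [0.05, 0.70, 0.25],
              [0.05, 0.50, 0.45]])
print(np.round(limiting_distribution(P), 4))
```

Overwriting a row works because one of the balance equations is always redundant for a regular chain, so the normalization can take its place.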
13  Example: social class matrix
- With states 0, 1, 2 as before,

    P = \begin{pmatrix} 0.40 & 0.50 & 0.10 \\ 0.05 & 0.70 & 0.25 \\ 0.05 & 0.50 & 0.45 \end{pmatrix}.

  What is the limiting probability?

14  Social class problem
- By Theorem 2.1, we set up the following system of linear equations:

    \pi_0 = 0.40\,\pi_0 + 0.05\,\pi_1 + 0.05\,\pi_2
    \pi_1 = 0.50\,\pi_0 + 0.70\,\pi_1 + 0.50\,\pi_2
    \pi_2 = 0.10\,\pi_0 + 0.25\,\pi_1 + 0.45\,\pi_2
    \pi_0 + \pi_1 + \pi_2 = 1

- Since one equation must be redundant, we arbitrarily strike out one of them.
- After solving the system, we obtain \pi_0 = 1/13, \pi_1 = 5/8 and \pi_2 = 31/104.

15  Interpretation of the limiting distribution
- After the process has been in operation for a long duration, the probability of finding the process in state j is \pi_j, irrespective of the starting state.
- If each visit to state j incurs a "cost" of c_j, then the long-run mean cost per unit time associated with this Markov chain is \sum_{j=0}^{N} \pi_j c_j (a numerical sketch follows after slide 16).

16  Example: reliability and redundancy
- An airline reservation system has two computers, only one of which is in operation at any given time.
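To make the long-run cost formula of slide 15 concrete, here is a short sketch (my addition; the cost values c_j are hypothetical, chosen only for illustration, and `numpy` is assumed):

```python
import numpy as np

# Limiting distribution of the social class chain, from slide 14.
pi = np.array([1 / 13, 5 / 8, 31 / 104])

# Hypothetical per-visit "costs" for states 0 (lower), 1 (middle), 2 (upper).
c = np.array([3.0, 1.0, 2.0])

# Long-run mean cost per unit time: sum_j pi_j * c_j.
print(round(float(pi @ c), 4))
```

For these made-up costs the chain pays roughly 1.4519 per unit time in the long run; any other cost vector is handled the same way.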