...vector q of initial probabilities. If there are k states, then p^n(x, y) is a k × k matrix. So to make the matrix multiplication work out right, we should take q as a 1 × k matrix, or a "row vector."

Example 1.15. Consider the weather chain (Example 1.3) and suppose that
the initial distribution is q(1) = 0.3 and q(2) = 0.7. In this case
\[
\begin{pmatrix} .3 & .7 \end{pmatrix}
\begin{pmatrix} .6 & .4 \\ .2 & .8 \end{pmatrix}
= \begin{pmatrix} .32 & .68 \end{pmatrix}
\]
since
\[
.3(.6) + .7(.2) = .32, \qquad .3(.4) + .7(.8) = .68
\]

Example 1.16. Consider the social mobility chain (Example 1.4) and suppose
that the initial distribution is q(1) = .5, q(2) = .2, and q(3) = .3. Multiplying
the vector q by the transition probability gives the vector of probabilities at
time 1.
\[
\begin{pmatrix} .5 & .2 & .3 \end{pmatrix}
\begin{pmatrix} .7 & .2 & .1 \\ .3 & .5 & .2 \\ .2 & .4 & .4 \end{pmatrix}
= \begin{pmatrix} .47 & .32 & .21 \end{pmatrix}
\]
To check the arithmetic, note that the three entries on the right-hand side are
\[
\begin{aligned}
.5(.7) + .2(.3) + .3(.2) &= .35 + .06 + .06 = .47 \\
.5(.2) + .2(.5) + .3(.4) &= .10 + .10 + .12 = .32 \\
.5(.1) + .2(.2) + .3(.4) &= .05 + .04 + .12 = .21
\end{aligned}
\]
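Both matrix-vector products above are easy to check numerically. A minimal NumPy sketch (the variable names are ours, not the text's):

```python
import numpy as np

# Weather chain (Example 1.3): transition matrix and initial distribution q
p_weather = np.array([[.6, .4],
                      [.2, .8]])
q_weather = np.array([.3, .7])
print(q_weather @ p_weather)   # distribution at time 1, ~ [0.32 0.68]

# Social mobility chain (Example 1.4)
p_social = np.array([[.7, .2, .1],
                     [.3, .5, .2],
                     [.2, .4, .4]])
q_social = np.array([.5, .2, .3])
print(q_social @ p_social)     # distribution at time 1, ~ [0.47 0.32 0.21]
```

Note that q is kept as a 1 × k row vector and multiplied on the left, exactly as described above.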
If qp = q then q is called a stationary distribution. If the distribution at
time 0 is the same as the distribution at time 1, then by the Markov property...
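The defining equation qp = q says that q is a left eigenvector of p with eigenvalue 1, so a stationary distribution can be found numerically. A sketch for the weather chain (our own code, not from the text):

```python
import numpy as np

# Weather chain transition matrix (Example 1.3)
p = np.array([[.6, .4],
              [.2, .8]])

# qp = q means q is a left eigenvector of p (i.e. a right eigenvector
# of p transpose) with eigenvalue 1; normalize it to sum to 1.
vals, vecs = np.linalg.eig(p.T)
idx = np.argmin(np.abs(vals - 1))    # eigenvalue closest to 1
q = np.real(vecs[:, idx])
q = q / q.sum()
print(q)   # ~ [1/3, 2/3]
```

Dividing by the sum fixes both the scale and the sign of the eigenvector, so the result is a genuine probability distribution.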
This document was uploaded on 03/06/2014 for the course MATH 4740 at Cornell University (Engineering School), Spring '10, instructor Durrett.