it will be the distribution at all times n ≥ 1.
Stationary distributions have a special importance in the theory of Markov
chains, so we will use a special letter π to denote solutions of the equation
πp = π.
To have a mental picture of what happens to the distribution of probability
when one step of the Markov chain is taken, it is useful to think that we have
q(i) pounds of sand at state i, with the total amount of sand ∑_i q(i) being one
pound. When a step is taken in the Markov chain, a fraction p(i, j) of the sand
at i is moved to j. The amount of sand at j when this has been done is

∑_i q(i) p(i, j)

If the distribution of sand is not changed by this procedure, then q is a
stationary distribution.
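The sand-moving picture can be sketched directly in code. The following is a minimal illustration, not from the text: the 2-state transition matrix p and the starting masses q are hypothetical, chosen only to show how one step redistributes one pound of sand.

```python
# Hypothetical 2-state transition matrix (rows sum to 1).
# p[i][j] is the fraction of the sand at state i that moves to state j.
p = [[0.7, 0.3],
     [0.5, 0.5]]

# Hypothetical starting distribution: one pound of sand in total.
q = [0.4, 0.6]

# After one step, the sand at state j is sum over i of q(i) * p(i, j).
new_q = [sum(q[i] * p[i][j] for i in range(2)) for j in range(2)]

# The total amount of sand is conserved: it is still one pound.
print(new_q, sum(new_q))
```

Because each row of p sums to 1, the step only moves sand around; it never creates or destroys any, which is why the total stays at one pound.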
Example 1.17. Weather chain. To compute the stationary distribution we
want to solve

(π1  π2) ( .6  .4 )  =  (π1  π2)
         ( .2  .8 )

Multiplying gives two equations:

.6π1 + .2π2 = π1
.4π1 + .8π2 = π2

Both equations reduce to .4π1 = .2π2. Since we want π1 + π2 = 1, we must
have .4π1 = .2 − .2π1, and hence

π1 = .2/(.2 + .4) = 1/3        π2 = .4/(.2 + .4) = 2/3
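The computation above is easy to check numerically. The sketch below multiplies π = (1/3, 2/3) by the weather chain's transition matrix from the example and confirms that πp = π.

```python
# Transition matrix of the weather chain from Example 1.17.
p = [[0.6, 0.4],
     [0.2, 0.8]]

# Candidate stationary distribution found above.
pi = [1/3, 2/3]

# Compute pi * p: entry j is sum over i of pi(i) * p(i, j).
pi_p = [sum(pi[i] * p[i][j] for i in range(2)) for j in range(2)]

# pi_p should equal pi (up to floating-point rounding).
print(pi_p)
```

The same check works for any candidate π: if multiplying by p returns the vector unchanged, it is a stationary distribution.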