…it will be the distribution at all times $n \ge 1$. Stationary distributions have a special importance in the theory of Markov chains, so we will use a special letter $\pi$ to denote solutions of the equation $\pi p = \pi$.

To have a mental picture of what happens to the distribution of probability when one step of the Markov chain is taken, it is useful to think that we have $q(i)$ pounds of sand at state $i$, with the total amount of sand $\sum_i q(i)$ being one pound. When a step is taken in the Markov chain, a fraction $p(i,j)$ of the sand at $i$ is moved to $j$. The distribution of sand when this has been done is

$$ qp = \sum_i q(i)\,p(i,j) $$

If the distribution of sand is not changed by this procedure, then $q$ is a stationary distribution.

Example 1.17. Weather chain. To compute the stationary distribution we want to solve

$$ \begin{pmatrix} \pi_1 & \pi_2 \end{pmatrix} \begin{pmatrix} .6 & .4 \\ .2 & .8 \end{pmatrix} = \begin{pmatrix} \pi_1 & \pi_2 \end{pmatrix} $$

Multiplying gives two equations:

$$ .6\pi_1 + .2\pi_2 = \pi_1 \qquad\qquad .4\pi_1 + .8\pi_2 = \pi_2 $$

Both equations reduce to $.4\pi_1 = .2\pi_2$. Since we want $\pi_1 + \pi_2 = 1$, we must have $.4\pi_1 = .2(1 - \pi_1)$, and hence

$$ \pi_1 = \frac{.2}{.2 + .4} = \frac{1}{3} \qquad\qquad \pi_2 = \frac{.4}{.2 + .4} = \frac{2}{3} $$
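The calculation in Example 1.17 can be checked numerically. Below is a minimal sketch using NumPy; finding $\pi$ as the normalized left eigenvector of $p$ for eigenvalue 1 is a standard technique, not a method taken from the text, and the 2x2 matrix is the weather-chain transition matrix from the example.

```python
import numpy as np

# Transition matrix of the weather chain from Example 1.17.
p = np.array([[0.6, 0.4],
              [0.2, 0.8]])

# A stationary distribution satisfies pi p = pi with pi summing to 1,
# i.e. pi is a left eigenvector of p for eigenvalue 1, which is the
# same as a right eigenvector of the transpose p^T.
eigvals, eigvecs = np.linalg.eig(p.T)

# Select the eigenvector whose eigenvalue is (numerically) 1 and
# normalize it so its entries sum to 1.
k = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, k])
pi = pi / pi.sum()

print(pi)  # approximately [1/3, 2/3], matching the hand calculation
```

One can also verify the "sand-moving" picture directly: `pi @ p` returns `pi` again, since one step of the chain leaves a stationary distribution unchanged.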