This sort of lumping of states becomes particularly important in uncountably infinite state spaces, where each individual state may be hit with probability 0 while sets of states can be hit with positive probability. In such a case, by considering a set of states as a new lumped state, we can produce an atom.
Next let us look at a case where there is no pair of states with exactly the same transition probabilities. This is the typical case; for example, in Example ..., no two states have the same next-state transition probabilities. But nearby states have nearly the same transition probabilities. This will allow us to use a modified version of the trick above. We'll see that it is enough for a set of states to have a "common component."
(1.82) Example. Consider the matrix

              ( .1  .5  .4 )
        P =   ( .4   0  .6 )
              ( .3  .2  .5 )

and suppose we are interested in lumping the states in the set R = {2, 3}. Now since P(2, ·) ≠ P(3, ·), things are not as simple as before. But note that rows 2 and 3 of P are both at least (.3, 0, .5) = 0.8(.375, 0, .625). In fact,

        P(2, ·) = (.4, 0, .6) = (.3, 0, .5) + (.1, 0, .1) = 0.8(.375, 0, .625) + 0.2(.5, 0, .5)

and

        P(3, ·) = (.3, .2, .5) = (.3, 0, .5) + (0, .2, 0) = 0.8(.375, 0, .625) + 0.2(0, 1, 0).
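The arithmetic in these two decompositions is easy to check directly. Here is a minimal sketch in Python (the names nu, rows, residuals are ours, introduced only for this check):

```python
# Check that rows 2 and 3 of P are each a 0.8/0.2 mixture of the
# common component nu = (.375, 0, .625) with a residual distribution.
nu = (0.375, 0.0, 0.625)

rows = {2: (0.4, 0.0, 0.6), 3: (0.3, 0.2, 0.5)}        # P(2,.), P(3,.)
residuals = {2: (0.5, 0.0, 0.5), 3: (0.0, 1.0, 0.0)}   # leftover parts

for i in (2, 3):
    mixture = tuple(0.8 * c + 0.2 * r for c, r in zip(nu, residuals[i]))
    # Each mixture reproduces the corresponding row of P (up to round-off).
    assert all(abs(m - p) < 1e-12 for m, p in zip(mixture, rows[i]))
    # Both the row and the residual are probability distributions.
    assert abs(sum(rows[i]) - 1) < 1e-12
    assert abs(sum(residuals[i]) - 1) < 1e-12
```

The assertions pass, confirming that both rows place at least 0.8 of their mass on the common component.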
These equations express each of the distributions P(2, ·) and P(3, ·) as a mixture of the distribution (.375, 0, .625) with some other distribution. In other words, both distributions P(2, ·) and P(3, ·) share the "common component" 0.8(.375, 0, .625). A useful interpretation
of these equations is as follows. Suppose we have access to a biased coin having probability
0.8 of Heads and probability 0.2 of Tails. In order to generate the next state of the chain,
given the present state is 2, we start by tossing the coin. If we get Heads, we then draw from
the distribution (.375, 0, .625), and if we get Tails, we draw from the distribution (.5, 0, .5).
Similarly, if we are now in state 3, we can generate the next state by tossing the same
coin, drawing from the distribution (.375, 0, .625) if we get Heads, and drawing from the distribution (0, 1, 0) if we get Tails.
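This coin-toss procedure can be sketched as a small simulation. The sketch below assumes the decomposition above; the function and variable names are ours, not the text's:

```python
import random

NU = [0.375, 0.0, 0.625]                 # common component, used on Heads
RESIDUAL = {2: [0.5, 0.0, 0.5],          # Tails distribution from state 2
            3: [0.0, 1.0, 0.0]}          # Tails distribution from state 3

def draw(pmf):
    """Sample a state in {1, 2, 3} from a probability mass function."""
    u, acc = random.random(), 0.0
    for state, p in enumerate(pmf, start=1):
        acc += p
        if u < acc:
            return state
    return len(pmf)                      # guard against round-off

def next_state(current):
    """Two-stage generation of the next state from state 2 or 3."""
    if random.random() < 0.8:            # Heads with probability 0.8 ...
        return draw(NU)                  # ... draw from the common component
    return draw(RESIDUAL[current])       # Tails: use the state's own residual
```

By construction, next_state(2) has distribution 0.8(.375, 0, .625) + 0.2(.5, 0, .5) = P(2, ·), and similarly for state 3.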
With this description, there are now two scenarios under which we use precisely the
same distribution, i.e., (.375, 0, .625), to generate the next state:
1. Enter state 2 and get Heads from coin toss
2. Enter state 3 and get Heads from coin toss
Since these two scenarios lead to the same next-state distribution, we can lump them
together into a new state.
So here is another way to conceptualize the running of this chain. At time t, say the state is Xt. First we look to see whether we are in either of states 2 or 3, and if so we toss the biased coin, getting the outcome C ∈ {Heads, Tails}. Then define X̃t as follows:

        X̃t = { Xt   if Xt = 1, or if Xt ∈ {2, 3} and C = Tails
             { *    if Xt ∈ {2, 3} and C = Heads,

where * denotes the new lumped state.
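The definition of X̃t amounts to a small function. Here is a sketch, under the assumption that the lumped state is labelled '*' (our label, chosen for illustration):

```python
import random

def lumped(x_t, p_heads=0.8):
    """Compute X-tilde_t from X_t: from states 2 or 3, toss the biased
    coin and move to the lumped state '*' on Heads (probability 0.8);
    otherwise keep the current state unchanged."""
    if x_t in (2, 3) and random.random() < p_heads:
        return '*'
    return x_t
```

For example, lumped(1) is always 1, while lumped(2) is '*' with probability 0.8 and 2 otherwise.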
Stochastic Processes J. Chang, March 30, 1999
1.10. GENERAL STATE SPACE MARKOV CHAINS Page 143

We can use X̃t to generate the next state Xt+1 as follows. If X̃t = 1, we draw Xt+1 from the probability mass function (.1, .5, .4). If X̃t = 2, then we know that Xt was 2 and C came out as Tails, so we use the m...