chain by enlarging the state space and considering the sequence of random vectors $R_0, L_0, R_1, L_1, \ldots$. This sequence does form a Markov chain. What is its probability transition function and stationary distribution? Now, assuming the Basic Limit Theorem applies here, what is the limiting distribution of $L_n$ as $n \to \infty$? This is the famous "length-biased sampling" distribution.

1.6 Irreducibility, periodicity, and recurrence
We now turn to the definition of irreducibility. Let i and j be two states. We say that j is accessible from i if it is possible, with positive probability, for the chain ever to visit state j if the chain starts in state i, or, in other words,
$$P\Bigl\{\,\bigcup_{n=0}^{\infty} \{X_n = j\} \Bigm| X_0 = i \Bigr\} > 0. \tag{1.27}$$

Clearly an equivalent condition is

$$\sum_{n=0}^{\infty} P^n(i, j) = \sum_{n=0}^{\infty} P\{X_n = j \mid X_0 = i\} > 0.$$

1.28 Exercise. Prove the last assertion.
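Since accessibility asks only whether some power of P has a positive (i, j) entry, it can be checked without computing matrix powers at all: it is reachability in the directed graph whose edges are the nonzero entries of P. Here is a minimal Python sketch (not part of the notes; the function name `accessible` is ours):

```python
from collections import deque

def accessible(P, i, j):
    """Check whether state j is accessible from state i, i.e. whether
    P^n(i, j) > 0 for some n >= 0, by breadth-first search over the
    directed graph whose edges are the nonzero entries of P."""
    seen = {i}                 # n = 0: every state is accessible from itself
    queue = deque([i])
    while queue:
        k = queue.popleft()
        if k == j:
            return True
        for m, p in enumerate(P[k]):
            if p > 0 and m not in seen:
                seen.add(m)
                queue.append(m)
    return False

# Hypothetical example: a chain that can move 0 -> 1 but never 1 -> 0.
P = [[0.5, 0.5],
     [0.0, 1.0]]
print(accessible(P, 0, 1))  # True
print(accessible(P, 1, 0))  # False
```

The search terminates after at most |S| vertex visits, whereas the sum in (1.28) ranges over infinitely many n.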
We say i communicates with j if j is accessible from i and i is accessible from j.
1.29 Exercise. Show that the relation "communicates with" is an equivalence relation. That is, show that the "communicates with" relation is reflexive, symmetric, and transitive.

(Stochastic Processes, J. Chang, March 30, 1999. 1. MARKOV CHAINS, Page 114.)

We say that the Markov chain is irreducible if all pairs of states communicate.
Recall that an equivalence relation on a set induces a partition of that set into equivalence classes. Thus, by Exercise 1.29, the state space S may be partitioned into what we will call "communicating classes," or simply "classes." The chain is irreducible if there is just one communicating class, that is, the whole state space S. Note that whether or not a Markov chain is irreducible is determined by the state space S and the transition matrix P(i, j); the initial distribution $\pi_0$ is irrelevant. In fact, all that matters is the pattern of zeroes in the transition matrix.
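Because only the zero pattern matters, the partition into communicating classes can be computed directly from the matrix by intersecting forward reachability sets. A sketch in Python (our own illustration; the helper names `communicating_classes` and `is_irreducible` are not from the notes):

```python
def communicating_classes(P):
    """Partition states into communicating classes: i and j belong to the
    same class iff each is accessible from the other.  Only the pattern
    of zeroes in P is used."""
    n = len(P)

    def reach(i):
        # All states accessible from i (depth-first search; includes i).
        seen, stack = {i}, [i]
        while stack:
            k = stack.pop()
            for m in range(n):
                if P[k][m] > 0 and m not in seen:
                    seen.add(m)
                    stack.append(m)
        return seen

    R = [reach(i) for i in range(n)]
    classes, assigned = [], set()
    for i in range(n):
        if i in assigned:
            continue
        cls = {j for j in R[i] if i in R[j]}  # mutual accessibility
        classes.append(sorted(cls))
        assigned |= cls
    return classes

def is_irreducible(P):
    return len(communicating_classes(P)) == 1

# Hypothetical reducible chain: {0, 1} communicate, state 2 is absorbing.
P = [[0.5, 0.5, 0.0],
     [0.4, 0.4, 0.2],
     [0.0, 0.0, 1.0]]
print(communicating_classes(P))  # [[0, 1], [2]]
print(is_irreducible(P))         # False
```

This is just the computation of strongly connected components of the transition graph, done naively; for large chains one would use Tarjan's linear-time algorithm instead.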
Why do we require irreducibility in the "Basic Limit Theorem" (1.17)? Here is a trivial example of how the conclusion can fail if we do not assume irreducibility. Let S = {0, 1} and let
$$P = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.$$
Clearly the resulting Markov chain is not irreducible. Also, clearly the conclusion of the Basic Limit Theorem does not hold; that is, $\pi_n$ does not approach any limit that is independent of $\pi_0$. In fact, $\pi_n = \pi_0$ for all n.
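The claim $\pi_n = \pi_0$ is immediate from the recursion $\pi_{n+1} = \pi_n P$ with P the identity, and is easy to confirm numerically. A quick Python check (our illustration, not part of the notes):

```python
def step(pi, P):
    """One step of the distribution recursion pi_{n+1} = pi_n P."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

P = [[1.0, 0.0],   # the identity matrix: both states are absorbing
     [0.0, 1.0]]
pi = [0.3, 0.7]    # an arbitrary initial distribution pi_0 (our choice)
for _ in range(10):
    pi = step(pi, P)
print(pi)  # [0.3, 0.7] -- pi_n = pi_0 for every n
```

Whatever $\pi_0$ we start with, the chain stays put, so there is no single limit independent of the initial distribution.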
Next, to discuss periodicity, let's begin with another trivial example: take S = {0, 1} again, and let
$$P = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}.$$
The conclusion of the Basic Limit Theorem does not hold here: for example, if $\pi_0 = (1, 0)$, then $\pi_n = (1, 0)$ if n is even and $\pi_n = (0, 1)$ if n is odd. So in this case $\pi_n(1)$ alternates between the two values 0 and 1 as n increases, and hence does not converge to anything. The problem in this example is not lack of irreducibility; clearly this chain is irreducible. So, assuming the Basic Limit Theorem is true, the chain must not be aperiodic! That is, the chain is periodic. The trouble stems from the fact that, starting from state 0 at time 0, the chain can visit state 0 only at even times. The same holds for state 1.
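The alternation described above can be seen directly by iterating $\pi_{n+1} = \pi_n P$ for the swap matrix. A short Python sketch (ours, not from the notes):

```python
def step(pi, P):
    """One step of the distribution recursion pi_{n+1} = pi_n P."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

P = [[0.0, 1.0],   # deterministic swap: the chain moves 0 -> 1 -> 0 -> ...
     [1.0, 0.0]]
pi = [1.0, 0.0]    # pi_0 = (1, 0)
history = []
for _ in range(4):
    pi = step(pi, P)
    history.append(pi)
print(history)  # [[0.0, 1.0], [1.0, 0.0], [0.0, 1.0], [1.0, 0.0]]
```

The sequence $\pi_1, \pi_2, \pi_3, \ldots$ cycles with period 2 and never settles down, exactly as the text claims.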
Given a Markov chain $\{X_0, X_1, \ldots\}$, define the period of a state i to be
$$d_i = \gcd\{n : P^n(i, i) > 0\}.$$
Note th...
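For a finite chain, the period just defined can be computed by scanning the diagonal entries of the boolean patterns of $P, P^2, P^3, \ldots$ and taking a running gcd. A Python sketch under our own conventions (the function name `period` and the cutoff `N` are ours; the gcd stabilizes quickly because the set of return times is closed under addition):

```python
from math import gcd

def period(P, i, N=50):
    """Compute d_i = gcd{n >= 1 : P^n(i, i) > 0} by scanning n = 1..N.
    Returns 0 if no return to i is seen within N steps (e.g. if i lies
    on no cycle).  Only the zero pattern of P is used."""
    m = len(P)
    A = [[P[r][c] > 0 for c in range(m)] for r in range(m)]
    reach = A          # boolean pattern of P^n, starting at n = 1
    d = 0
    for n in range(1, N + 1):
        if reach[i][i]:
            d = gcd(d, n)
        if d == 1:     # gcd can never drop below 1; stop early
            break
        # Boolean matrix product: reach becomes the pattern of P^(n+1).
        reach = [[any(reach[r][k] and A[k][c] for k in range(m))
                  for c in range(m)] for r in range(m)]
    return d

# The two-state swap chain from the example above has period 2.
P = [[0.0, 1.0],
     [1.0, 0.0]]
print(period(P, 0))  # 2 -- returns to state 0 occur only at even times
```

For the swap chain the set of possible return times to either state is {2, 4, 6, ...}, so the gcd is 2, matching the discussion of the example.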