…$\pi$, which corresponds to the result of perfect shuffling (or "shuffling infinitely many times"). Suppose, for illustration, that the total variation distance $\|\pi_7 - \pi\|$ happens to be 0.17. This tells us that the probability of any event (for example, the probability of winning any specified card game) using a deck shuffled 7 times differs by at most 0.17 from the probability of the same event using a perfectly shuffled deck.
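As a quick numerical illustration of this interpretation (a sketch, not part of the notes; the pmfs below are made up), the total variation distance between two probability mass functions on a finite set equals half their $L^1$ distance, and it bounds the difference in probability they assign to any event:

```python
# Total variation distance between two pmfs on a common finite state space:
# ||mu - nu|| = (1/2) * sum_i |mu(i) - nu(i)|
# = max over events A of |mu(A) - nu(A)|.

def tv_distance(mu, nu):
    return 0.5 * sum(abs(m - n) for m, n in zip(mu, nu))

mu = [0.5, 0.3, 0.2]      # hypothetical "partially shuffled" law
nu = [1/3, 1/3, 1/3]      # hypothetical "perfectly shuffled" uniform law

d = tv_distance(mu, nu)
print(d)  # here 0.5*(|0.5-1/3| + |0.3-1/3| + |0.2-1/3|) = 1/6
```

So if this distance were 0.17, no event's probability could differ by more than 0.17 under the two laws.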
1.53 Exercise. Let $\mu_0$ and $\nu_0$ be probability mass functions on $S$, and define $\mu_1 = \mu_0 P$ and $\nu_1 = \nu_0 P$, where $P$ is a probability transition matrix. Show that $\|\mu_1 - \nu_1\| \le \|\mu_0 - \nu_0\|$.

Stochastic Processes, J. Chang, March 30, 1999. 1. MARKOV CHAINS, page 126.

To introduce the coupling method, let $Y_0, Y_1, \ldots$ be a Markov chain with the same probability transition matrix as $X_0, X_1, \ldots$, but let $Y_0$ have the distribution $\pi$; that is, we start the $Y$ chain off in the initial distribution $\pi$ instead of the initial distribution $\pi_0$ of the $X$ chain. Note that $\{Y_n\}$ is a stationary Markov chain and, in particular, that $Y_n$ has the distribution $\pi$ for all $n$. Further, let the $Y$ chain be independent of the $X$ chain.
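The contraction property of Exercise 1.53 is easy to check numerically; here is a minimal sketch, with an arbitrary 3-state transition matrix and initial pmfs chosen for illustration (none of these numbers come from the notes):

```python
# Check on an example that applying a transition matrix P can only
# shrink (or preserve) total variation distance: ||mu0 P - nu0 P|| <= ||mu0 - nu0||.

def tv(mu, nu):
    return 0.5 * sum(abs(m - n) for m, n in zip(mu, nu))

def step(mu, P):
    # One step of the chain in distribution: mu_1 = mu_0 P (row vector times matrix).
    n = len(P)
    return [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]

P = [[0.9, 0.1, 0.0],
     [0.2, 0.5, 0.3],
     [0.0, 0.4, 0.6]]   # arbitrary stochastic matrix
mu0 = [1.0, 0.0, 0.0]   # point mass at state 0
nu0 = [0.0, 0.0, 1.0]   # point mass at state 2

mu1, nu1 = step(mu0, P), step(nu0, P)
print(tv(mu1, nu1), "<=", tv(mu0, nu0))  # 0.9 <= 1.0
```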
Roughly speaking, we want to show that for large $n$ the probabilistic behavior of $X_n$ is close to that of $Y_n$. The next result says that we can do this by showing that, for large $n$, the $X$ and $Y$ chains have met with high probability by time $n$. Define the coupling time $T$ to be the first time at which $X_n$ equals $Y_n$:
$$T = \inf\{n : X_n = Y_n\},$$
where of course we define $T = \infty$ if $X_n \neq Y_n$ for all $n$.
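This setup is easy to simulate. A minimal sketch, assuming a hypothetical two-state chain (the matrix, seed, and sample size are illustrative and not from the notes): run $X$ from a point mass, run an independent $Y$ from the stationary distribution $\pi$, and record when they first meet.

```python
import random

P = [[0.5, 0.5],
     [0.3, 0.7]]        # hypothetical transition matrix
pi = [3/8, 5/8]         # its stationary distribution: pi = pi P

def move(state, P):
    # One transition, driven by a fresh uniform random number
    # (each "frog" reads from its own independent table of uniforms).
    return 0 if random.random() < P[state][0] else 1

def coupling_time(max_steps=10_000):
    x = 0                                     # X_0 ~ pi_0 = point mass at 0
    y = 0 if random.random() < pi[0] else 1   # Y_0 ~ pi
    for n in range(max_steps + 1):
        if x == y:
            return n                          # T = inf{n : X_n = Y_n}
        x, y = move(x, P), move(y, P)
    return float("inf")                       # T = infinity if they never meet

random.seed(1)
times = [coupling_time() for _ in range(1000)]
print(sum(times) / len(times))   # Monte Carlo estimate of E[T]
```

For a chain this small the two trajectories meet almost immediately; the point of the construction is that $P\{T > n\}$ is exactly the quantity the coupling inequality below uses.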
1.54 Lemma ("The coupling inequality"). For all $n$ we have $\|\pi_n - \pi\| \le P\{T > n\}$.
Proof: Define the process $\{Y_n^*\}$ by
$$Y_n^* = \begin{cases} Y_n & \text{if } n \le T, \\ X_n & \text{if } n > T. \end{cases}$$
It is easy to see that $\{Y_n^*\}$ is a Markov chain, and it has the same probability transition matrix $P(i,j)$ as $\{X_n\}$ has. To understand this, start by thinking of the $X$ chain as a
frog carrying a table of random numbers, jumping around in the state space. The frog uses his table of iid uniform random numbers to generate his path, as we described earlier in the section about specifying and simulating Markov chains. He uses the first number in his table, together with his initial distribution $\pi_0$, to determine $X_0$, and then reads down successive numbers in the table to determine the successive transitions on his path. The $Y$ frog does the same sort of thing, except he uses his own, different table of uniform random numbers (so he will be independent of the $X$ frog), and he starts out with the initial
distribution $\pi$ instead of $\pi_0$. How about the $Y^*$ frog? Is he also doing a Markov chain? Well, is he choosing his transitions using uniform random numbers like the other frogs? Yes, he is; the only difference is that he starts by using $Y$'s table of random numbers, and hence follows $Y$, until the coupling time $T$, after which he stops reading numbers from $Y$'s table and switches to $X$'s table. But big deal; he is still generating his path by using uniform random numbers in the way required to generate a Markov chain. The chain $\{Y_n^*\}$ is stationary: $Y_0^* \sim \pi$, since $Y_0^* = Y_0$ and $Y_0 \sim \pi$. Thus, $Y_n^* \sim \pi$ for all $n$, so that for $A \subseteq S$
we have
$$\begin{aligned}
\pi_n(A) - \pi(A) &= P\{X_n \in A\} - P\{Y_n^* \in A\} \\
&= P\{X_n \in A,\ T \le n\} + P\{X_n \in A,\ T > n\} \\
&\quad - P\{Y_n^* \in A,\ T \le n\} - P\{Y_n^* \in A,\ T > n\}.
\end{aligned}$$
1.8. PROOF OF THE BASIC LIMIT THEOREM

Howev...