…positive recurrent Markov chain has a unique stationary distribution, given by
$$\pi_j = \frac{1}{E_j T_j}.$$
Proof: For the uniqueness, let $\pi$ be a stationary distribution. We start with the relation
$$\sum_i \pi_i P^t(i,j) = \pi_j,$$
which holds for all $t$. Averaging this over values of $t$ from 1 to $n$ gives
$$\sum_i \pi_i \left[\frac{1}{n}\sum_{t=1}^{n} P^t(i,j)\right] = \pi_j.$$
By Corollary 1.58 and the Dominated Convergence Theorem, the left side of the last equation approaches
$$\sum_i \pi_i \frac{1}{E_j T_j} = \frac{1}{E_j T_j}$$
as $n \to \infty$. Thus, $\pi_j = 1/E_j T_j$, which establishes the uniqueness assertion.
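The formula $\pi_j = 1/E_j T_j$ can be checked numerically. The sketch below is my own illustration, not from the text: it uses a made-up $3 \times 3$ transition matrix, computes $\pi$ by solving $\pi P = \pi$ with $\sum_j \pi_j = 1$, and computes each expected return time $E_j T_j$ by first-step analysis.

```python
import numpy as np

# A small irreducible chain; the transition matrix is a made-up example.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
n = P.shape[0]

# Stationary distribution: solve pi P = pi together with sum(pi) = 1.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi = np.linalg.lstsq(A, b, rcond=None)[0]

def expected_return_time(P, j):
    """E_j(T_j) by first-step analysis: solve (I - Q) h = 1 for the
    expected hitting times h_i = E_i(time to reach j), where Q drops
    row and column j from P; then E_j(T_j) = 1 + sum_k P(j,k) h_k."""
    m = P.shape[0]
    idx = [i for i in range(m) if i != j]
    h = np.linalg.solve(np.eye(m - 1) - P[np.ix_(idx, idx)], np.ones(m - 1))
    full_h = np.zeros(m)
    full_h[idx] = h
    return 1.0 + P[j] @ full_h

ret = np.array([expected_return_time(P, j) for j in range(n)])
print(pi)         # the stationary distribution
print(1.0 / ret)  # matches pi, i.e. pi_j = 1/E_j(T_j)
```

The two printed vectors agree, as the uniqueness argument predicts.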
We begin the proof of existence by first treating the special case in which the state space is finite. The proof is simpler here than in the general case, which involves some distracting technicalities.
So assume for the moment that the state space is finite. We begin again with Corollary 1.58, which says that
$$(1.60)\qquad \frac{1}{n}\sum_{t=1}^{n} P^t(i,j) \to \frac{1}{E_j T_j}.$$
However, the sum over all $j$ of the left side of (1.60) is 1, for all $n$. Therefore,
$$\sum_j \frac{1}{E_j T_j} = 1.$$

Stochastic Processes J. Chang, March 30, 1999 1.9. A SLLN FOR MARKOV CHAINS Page 133

That's good, since we want our claimed stationary distribution to be a probability distribution.
Next we write out the matrix equation $P^t P = P^{t+1}$ as follows:
$$(1.61)\qquad \sum_k P^t(i,k)\,P(k,j) = P^{t+1}(i,j).$$
Averaging this over $t = 1, \ldots, n$ gives
$$\sum_k \left[\frac{1}{n}\sum_{t=1}^{n} P^t(i,k)\right] P(k,j) = \frac{1}{n}\sum_{t=1}^{n} P^{t+1}(i,j).$$
Taking the limit as $n \to \infty$ of the last equation and using (1.60) again gives
$$\sum_k \frac{1}{E_k T_k}\,P(k,j) = \frac{1}{E_j T_j}.$$
Thus, our claimed stationary distribution is indeed stationary.
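The convergence in (1.60) is easy to see numerically: the Cesàro averages $\frac{1}{n}\sum_{t=1}^{n} P^t$ settle down to a matrix whose rows are all the same vector, independent of the starting state $i$. A quick check, again on a made-up chain of my own:

```python
import numpy as np

# Made-up transition matrix (a toy example, not from the text).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Cesaro average (1/n) * sum_{t=1}^{n} P^t for a large n.
n = 5000
Pt = P.copy()
avg = np.zeros_like(P)
for _ in range(n):
    avg += Pt
    Pt = Pt @ P
avg /= n

# Every row of avg is (nearly) the same vector: the limit in (1.60)
# does not depend on the starting state i, and each row sums to 1.
print(avg)
```

Each row of the printed matrix approximates $(1/E_1 T_1, \ldots)$, the stationary distribution.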
Finally, let's see how to handle the infinite state space case. Let $A \subset S$ be a finite subset of the state space. Summing (1.60) over $j \in A$ gives the inequality
$$\sum_{j \in A} \frac{1}{E_j T_j} \le 1.$$
Therefore, since this is true for all finite subsets $A$, we get
$$\sum_{j \in S} \frac{1}{E_j T_j} =: C \le 1.$$
By the assumption of positive recurrence, we have $C > 0$; in a moment we'll see that $C = 1$.
The same sort of treatment of (1.61) (i.e., sum over $k \in A$, average over $t = 1, \ldots, n$, let $n \to \infty$, and then take the supremum over finite subsets $A$ of $S$) gives the inequality
$$(1.62)\qquad \sum_k \frac{1}{E_k T_k}\,P(k,j) \le \frac{1}{E_j T_j}.$$
However, the sum over all $j$ of the left side of (1.62) is
$$\sum_k \frac{1}{E_k T_k} \sum_j P(k,j) = \sum_k \frac{1}{E_k T_k},$$
which is the same as the sum of the right side of (1.62). Thus, the left and right sides of (1.62) must be the same for all $j$. From this we may conclude that the distribution
$$\tilde{\pi}_j = \frac{1}{C}\,\frac{1}{E_j T_j}$$
is stationary, so that, in particular, we know that our chain does have a stationary distribution. Thus, by the uniqueness assertion we proved above, we must have $C = 1$, and we are done.
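For intuition in the infinite-state case, here is a simulation sketch (my own example, not from the text) using the "success-run" chain on $\{0, 1, 2, \ldots\}$, which moves from $i$ to $i+1$ with probability $p$ and resets to $0$ otherwise. This chain is positive recurrent for $p < 1$, with stationary distribution $\pi_j = (1-p)p^j$, so the long-run fraction of time spent in state $j$ should approach $\pi_j = 1/E_j T_j$.

```python
import random

# Success-run chain on {0, 1, 2, ...}: from i, step to i+1 with
# probability p, otherwise reset to 0. Positive recurrent for p < 1,
# with pi_j = (1 - p) * p**j.
random.seed(0)
p = 0.5
steps = 200_000
state = 0
visits = {}
for _ in range(steps):
    visits[state] = visits.get(state, 0) + 1
    state = state + 1 if random.random() < p else 0

# Compare empirical occupation frequencies with pi_j = (1-p) p^j.
for j in range(4):
    print(j, visits[j] / steps, (1 - p) * p ** j)
```

The empirical frequencies track $(1-p)p^j$ closely, which is exactly the long-run-fraction reading of $\pi_j = 1/E_j T_j$.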
1.63 Exercise. Consider a knight sitting on the lower left corner square of an ordinary 8 × 8 chess board. The knight has residual froglike tendencies, left over from an old spell an older witch cast upon him. So he performs a random walk on the chess board, at each time choosing a random move uniformly distributed over the set of his possib…
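The exercise statement is cut off above, but whatever is asked, the knight's walk is a random walk on the knight-move graph of the board, and for a random walk on a connected graph the stationary distribution is proportional to vertex degree (a standard fact). Combined with the theorem just proved, this gives $E_j T_j = \big(\sum_v \deg v\big) / \deg(j)$. A short sketch computes the knight-move degrees:

```python
# Knight-move degree of each square on an 8 x 8 board.
MOVES = [(1, 2), (2, 1), (-1, 2), (-2, 1), (1, -2), (2, -1), (-1, -2), (-2, -1)]

def degree(r, c):
    """Number of legal knight moves from square (r, c)."""
    return sum(0 <= r + dr < 8 and 0 <= c + dc < 8 for dr, dc in MOVES)

total = sum(degree(r, c) for r in range(8) for c in range(8))
corner = degree(0, 0)  # the lower-left corner square
print(total, corner, total / corner)  # -> 336 2 168.0
```

In particular, the expected return time to the corner square works out to $336/2 = 168$ moves.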