…distribution $\pi(i)$. Let $X_m$, $0 \le m \le n$, be a realization of the Markov chain starting from the stationary distribution, i.e., $P(X_0 = i) = \pi(i)$. The next result says that if we watch the process $X_m$, $0 \le m \le n$, backwards, then it is a Markov chain.

Theorem 1.25. Fix $n$ and let $Y_m = X_{n-m}$ for $0 \le m \le n$. Then $Y_m$ is a Markov chain with transition probability
$$\hat p(i,j) = P(Y_{m+1} = j \mid Y_m = i) = \frac{\pi(j)\,p(j,i)}{\pi(i)} \qquad (1.13)$$

Proof. We need to calculate the conditional probability
$$P(Y_{m+1} = i_{m+1} \mid Y_m = i_m, Y_{m-1} = i_{m-1}, \ldots, Y_0 = i_0) = \frac{P(X_{n-(m+1)} = i_{m+1},\, X_{n-m} = i_m,\, X_{n-m+1} = i_{m-1}, \ldots, X_n = i_0)}{P(X_{n-m} = i_m,\, X_{n-m+1} = i_{m-1}, \ldots, X_n = i_0)}$$

Using the Markov property, we see the numerator is equal to
$$\pi(i_{m+1})\,p(i_{m+1}, i_m)\,P(X_{n-m+1} = i_{m-1}, \ldots, X_n = i_0 \mid X_{n-m} = i_m)$$

Similarly, the denominator can be written as
$$\pi(i_m)\,P(X_{n-m+1} = i_{m-1}, \ldots, X_n = i_0 \mid X_{n-m} = i_m)$$

Dividing the last two formulas and noticing that the conditional probabilities cancel, we have
$$P(Y_{m+1} = i_{m+1} \mid Y_m = i_m, \ldots, Y_0 = i_0) = \frac{\pi(i_{m+1})\,p(i_{m+1}, i_m)}{\pi(i_m)} = \hat p(i_m, i_{m+1})$$
which depends only on $i_m$, so $Y_m$ is a Markov chain with the transition probability given in (1.13).
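Formula (1.13) is easy to check numerically. The sketch below uses a small illustrative transition matrix (an assumption for the example, not taken from the text): it computes the stationary distribution $\pi$, builds the reversed-chain matrix $\hat p(i,j) = \pi(j)p(j,i)/\pi(i)$, and verifies that $\hat p$ is a stochastic matrix with the same stationary distribution.

```python
import numpy as np

# Hypothetical irreducible 3-state transition matrix (rows sum to 1);
# this particular matrix is an assumption chosen for illustration.
p = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

# Stationary distribution pi: the left eigenvector of p for eigenvalue 1,
# normalized to sum to 1.
vals, vecs = np.linalg.eig(p.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

# Reversed-chain transition probabilities, equation (1.13):
# p_hat[i, j] = pi[j] * p[j, i] / pi[i]
p_hat = (p.T * pi) / pi[:, None]

# Each row of p_hat sums to 1 because sum_j pi(j) p(j, i) = pi(i).
assert np.allclose(p_hat.sum(axis=1), 1.0)
# pi is also the stationary distribution of the reversed chain.
assert np.allclose(pi @ p_hat, pi)
```

Note that if the chain is reversible (i.e., the detailed balance condition $\pi(i)p(i,j) = \pi(j)p(j,i)$ holds), then $\hat p = p$ and the chain looks the same run forwards or backwards.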
This document was uploaded on 03/06/2014 for the course MATH 4740 at Cornell.