…distribution $\pi(i)$. Let $X_n$ be a realization of the Markov chain starting from the stationary distribution, i.e., $P(X_0 = i) = \pi(i)$. The next result says that if we watch the process $X_m$, $0 \le m \le n$, backwards, then it is a Markov chain.

36 CHAPTER 1. MARKOV CHAINS

Theorem 1.25. Fix $n$ and let $Y_m = X_{n-m}$ for $0 \le m \le n$. Then $Y_m$ is a Markov chain with transition probability

$$\hat p(i,j) = P(Y_{m+1} = j \mid Y_m = i) = \frac{\pi(j)\, p(j,i)}{\pi(i)} \tag{1.13}$$

Proof. We need to calculate the conditional probability:

$$P(Y_{m+1} = i_{m+1} \mid Y_m = i_m, Y_{m-1} = i_{m-1}, \ldots, Y_0 = i_0)
= \frac{P(X_{n-(m+1)} = i_{m+1},\, X_{n-m} = i_m,\, X_{n-m+1} = i_{m-1}, \ldots, X_n = i_0)}{P(X_{n-m} = i_m,\, X_{n-m+1} = i_{m-1}, \ldots, X_n = i_0)}$$

Using the Markov property, we see that the numerator is equal to

$$\pi(i_{m+1})\, p(i_{m+1}, i_m)\, P(X_{n-m+1} = i_{m-1}, \ldots, X_n = i_0 \mid X_{n-m} = i_m)$$

Similarly, the denominator can be written as

$$\pi(i_m)\, P(X_{n-m+1} = i_{m-1}, \ldots, X_n = i_0 \mid X_{n-m} = i_m)$$

Dividing the last two formulas and noticing that the conditional probabilities cancel, we have

$$P(Y_{m+1} = i_{m+1} \mid Y_m = i_m, \ldots, Y_0 = i_0) = \frac{\pi(i_{m+1})\, p(i_{m+1}, i_m)}{\pi(i_m)}$$
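Equation (1.13) can be checked numerically. The sketch below uses a small transition matrix of my own invention (not from the text): it computes the stationary distribution $\pi$, builds the reversed-chain matrix $\hat p(i,j) = \pi(j)\,p(j,i)/\pi(i)$, and verifies that $\hat p$ is a genuine transition matrix and that $\pi$ is stationary for the reversed chain as well.

```python
import numpy as np

# A hypothetical 3-state Markov chain (example data, not from the text).
p = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

# Stationary distribution: solve pi (P - I) = 0 together with sum(pi) = 1
# as an overdetermined least-squares system.
A = np.vstack([p.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Reversed-chain transitions from (1.13): phat[i, j] = pi[j] * p[j, i] / pi[i].
phat = (p.T * pi) / pi[:, None]

# Rows of phat sum to 1 because sum_j pi(j) p(j, i) = pi(i).
assert np.allclose(phat.sum(axis=1), 1.0)
# pi is also stationary for the reversed chain.
assert np.allclose(pi @ phat, pi)
```

Note that both checks are exactly the two uses of stationarity in the proof: the row sums of $\hat p$ equal $(\pi P)_i / \pi(i) = 1$, and summing $\pi(i)\hat p(i,j)$ over $i$ returns $\pi(j)$.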

This document was uploaded on 03/06/2014 for the course MATH 4740 at Cornell.
