…of the process up to that time: $X_0, \ldots, X_n$. To see that $T_y$ is a stopping time, note that
$$\{T_y = n\} = \{X_1 \neq y, \ldots, X_{n-1} \neq y, X_n = y\}$$
and that the right-hand side can be determined from $X_0, \ldots, X_n$.

Since stopping at time $n$ depends only on the values $X_0, \ldots, X_n$, and in a Markov chain the distribution of the future depends on the past only through the current state, it should not be hard to believe that the Markov property holds at stopping times. This fact can be stated formally as:

Theorem 1.2. Strong Markov property. Suppose $T$ is a stopping time. Given that $T = n$ and $X_T = y$, any other information about $X_0, \ldots, X_T$ is irrelevant for predicting the future, and $X_{T+k}$, $k \geq 0$, behaves like the Markov chain with initial state $y$.

Why is this true? To keep things as simple as possible we will show only that
$$P(X_{T+1} = z \mid X_T = y, T = n) = p(y, z).$$
Let $V_n$ be the set of vectors $(x_0, \ldots, x_n)$ such that if $X_0 = x_0, \ldots, X_n = x_n$, then $T = n$ and $X_T = y$. Breaking things down…
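As a sketch of how this breakdown can be completed (a reconstruction from the definitions above, not part of the excerpt): decomposing the event $\{T = n, X_T = y\}$ according to which vector $(x_0, \ldots, x_n) \in V_n$ occurred,
$$P(X_{T+1} = z, T = n, X_T = y) = \sum_{(x_0,\ldots,x_n) \in V_n} P(X_{n+1} = z, X_n = x_n, \ldots, X_0 = x_0)$$
$$= \sum_{(x_0,\ldots,x_n) \in V_n} p(y, z)\, P(X_n = x_n, \ldots, X_0 = x_0) = p(y, z)\, P(T = n, X_T = y),$$
where the middle equality uses the ordinary Markov property together with the fact that $x_n = y$ for every vector in $V_n$. Dividing by $P(T = n, X_T = y)$ gives the claimed identity.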
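To make the hitting time and the strong Markov property concrete, here is a minimal simulation sketch. The three-state chain and its transition matrix are assumed purely for illustration and do not come from the text; the code computes $T_y$ from $X_0, \ldots, X_n$ alone and checks empirically that, given $X_T = y$, the next step is distributed according to $p(y, \cdot)$.

```python
import numpy as np

# Minimal illustration (assumed example, not from the text): a 3-state chain
# with transition matrix p. T_y is the first hitting time of state y.
p = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
rng = np.random.default_rng(0)

def run_chain(x0, n_steps):
    """Simulate X_0, ..., X_{n_steps} starting from x0."""
    path = [x0]
    for _ in range(n_steps):
        path.append(rng.choice(3, p=p[path[-1]]))
    return path

def hitting_time(path, y):
    """T_y = min{n >= 1 : X_n = y}; it is determined by X_0, ..., X_n alone."""
    for n in range(1, len(path)):
        if path[n] == y:
            return n
    return None  # y was not hit within the simulated horizon

# Empirical check of the strong Markov property at T = T_y:
# conditional on X_T = y, the next step X_{T+1} should be distributed as p(y, .).
y, counts = 1, np.zeros(3)
for _ in range(20000):
    path = run_chain(x0=0, n_steps=50)
    T = hitting_time(path, y)
    if T is not None and T + 1 < len(path):
        counts[path[T + 1]] += 1

print("empirical P(X_{T+1}=z | X_T=y):", counts / counts.sum())
print("p(y, .)                       :", p[y])
```

The two printed vectors should agree up to Monte Carlo error, which is exactly the content of the special case proved above: the step taken immediately after the random time $T_y$ has the same distribution $p(y, \cdot)$ as a step of the chain started afresh at $y$.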