Chapter 10 Alternate Characterizations of Markov Processes

This lecture introduces two ways of characterizing Markov processes other than through their transition probabilities.

Section 10.1 addresses a question raised in the last class, about when being Markovian relative to one filtration implies being Markovian relative to another.

Section 10.2 describes discrete-parameter Markov processes as transformations of sequences of IID uniform variables.

Section 10.3 describes Markov processes in terms of measure-preserving transformations (Markov operators), and shows that this is equivalent to the transition-probability view.

10.1 The Markov Property Under Multiple Filtrations

In the last lecture, we defined what it is for a process to be Markovian relative to a given filtration $\mathcal{F}_t$. The question came up in class of when knowing that $X$ is Markov with respect to one filtration $\mathcal{F}_t$ will allow us to deduce that it is Markov with respect to another, say $\mathcal{G}_t$. To begin with, let's introduce a little notation.

Definition 106 (Natural Filtration) The natural filtration for a stochastic process $X$ is $\mathcal{F}^X_t \equiv \sigma\left(\{X_u,\ u \leq t\}\right)$.

Obviously, every process $X$ is adapted to its natural filtration $\mathcal{F}^X_t$.

Definition 107 (Comparison of Filtrations) A filtration $\mathcal{G}_t$ is finer than (or more refined than, or a refinement of) $\mathcal{F}_t$, written $\mathcal{F}_t \prec \mathcal{G}_t$, if $\mathcal{F}_t \subseteq \mathcal{G}_t$ for all $t$, and the inclusion is strict for at least some $t$. In that case $\mathcal{F}_t$ is coarser or less fine than $\mathcal{G}_t$. If $\mathcal{F}_t \prec \mathcal{G}_t$ or $\mathcal{F}_t = \mathcal{G}_t$, we write $\mathcal{F}_t \preceq \mathcal{G}_t$.
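A simple way to obtain a refinement of the natural filtration, worth keeping in mind for the next theorem: let $Y$ be any other stochastic process on the same probability space, say a sequence of coin flips independent of $X$, and set

$$\mathcal{G}_t \equiv \sigma\left(\{X_u, Y_u,\ u \leq t\}\right).$$

Then $X$ is adapted to $\mathcal{G}_t$ and $\mathcal{F}^X_t \subseteq \mathcal{G}_t$ for every $t$; the inclusion is strict whenever $Y$ is non-degenerate, since an event of probability strictly between 0 and 1 which is independent of $\sigma\left(\{X_u,\ u \leq t\}\right)$ cannot belong to it. Theorem 109 below says that passing from such a finer filtration down to the natural one preserves the Markov property.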
Lemma 108 If $X$ is adapted to $\mathcal{G}_t$, then $\mathcal{F}^X_t \preceq \mathcal{G}_t$.

Proof: Fix $t$. For every $u \leq t$, $X_u$ is $\mathcal{G}_u$-measurable, hence $\mathcal{G}_t$-measurable. But $\mathcal{F}^X_t$ is, by construction, the smallest $\sigma$-algebra with respect to which all the $X_u$, $u \leq t$, are measurable, so $\mathcal{F}^X_t \subseteq \mathcal{G}_t$ for every $t$, and the result follows. $\Box$

Theorem 109 If $X$ is Markovian with respect to $\mathcal{G}_t$, then it is Markovian with respect to any coarser filtration to which it is adapted, and in particular with respect to its natural filtration.

Proof: Use the smoothing property of conditional expectations: for any two $\sigma$-fields $\mathcal{F} \subset \mathcal{G}$ and any integrable random variable $Y$, $\mathbf{E}\left[Y \mid \mathcal{F}\right] = \mathbf{E}\left[\mathbf{E}\left[Y \mid \mathcal{G}\right] \mid \mathcal{F}\right]$ a.s. So, if $\mathcal{F}_t$ is coarser than $\mathcal{G}_t$, $X$ is adapted to $\mathcal{F}_t$, and $X$ is Markovian with respect to $\mathcal{G}_t$, then for any function $f \in L_1$ and any times $s > t$,

\begin{align}
\mathbf{E}\left[f(X_s) \mid \mathcal{F}_t\right] &= \mathbf{E}\left[\mathbf{E}\left[f(X_s) \mid \mathcal{G}_t\right] \mid \mathcal{F}_t\right] ~ \text{a.s.} \tag{10.1}\\
&= \mathbf{E}\left[\mathbf{E}\left[f(X_s) \mid X_t\right] \mid \mathcal{F}_t\right] \tag{10.2}\\
&= \mathbf{E}\left[f(X_s) \mid X_t\right] \tag{10.3}
\end{align}

where the last line uses the facts that (i) $\mathbf{E}\left[f(X_s) \mid X_t\right]$ is a function of $X_t$, (ii) $X$ is adapted to $\mathcal{F}_t$, so $X_t$ is $\mathcal{F}_t$-measurable, and (iii) if $Y$ is $\mathcal{F}$-measurable, then $\mathbf{E}\left[Y \mid \mathcal{F}\right] = Y$. Since this holds for all $f \in L_1$, it holds in particular for the indicator $\mathbf{1}_A$ of any measurable set $A$, and this establishes the conditional independence of $X_s$ and $\mathcal{F}_t$ given $X_t$, which constitutes the Markov property. Since (Lemma 108) the natural filtration is the coarsest filtration to which $X$ is adapted, the remainder of the theorem follows. $\Box$
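To see the content of Theorem 109 concretely in the discrete-parameter, finite-state case, here is a minimal simulation sketch (Python with NumPy; the two-state chain, the independent coin process, and all parameter values are illustrative choices, not anything taken from the notes). The chain $X$ is Markov with respect to the filtration generated by $(X, Y)$, where $Y$ is an independent coin sequence, and the sketch checks empirically that conditioning the next step of $X$ on the extra coordinate $Y_t$ does not change the transition probabilities, i.e. that $X$ is also Markov with respect to its natural filtration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-state transition matrix: P[i, j] = P(X_{t+1} = j | X_t = i).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

T = 200_000
Y = rng.integers(0, 2, size=T)      # independent coin flips: the "extra" information in G_t
X = np.empty(T, dtype=int)
X[0] = 0
for t in range(T - 1):
    X[t + 1] = rng.choice(2, p=P[X[t]])

# Conditioning on X_t alone (the relevant part of the natural filtration F^X_t).
for i in range(2):
    mask = X[:-1] == i
    print(f"P(X_next = 1 | X = {i})        ~ {X[1:][mask].mean():.3f}   (true {P[i, 1]:.3f})")

# Conditioning on (X_t, Y_t), i.e. on the extra coordinate carried by the finer filtration G_t.
# Since Y is independent of X, the estimates should agree with the ones above up to
# Monte Carlo error -- the situation covered by Theorem 109.
for i in range(2):
    for y in range(2):
        mask = (X[:-1] == i) & (Y[:-1] == y)
        print(f"P(X_next = 1 | X = {i}, Y = {y}) ~ {X[1:][mask].mean():.3f}   (true {P[i, 1]:.3f})")
```

This only illustrates the one-step conditional probabilities on a finite state space; the general statement, for arbitrary state spaces and all $f \in L_1$, is exactly what the measure-theoretic argument above delivers.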
The converse is false, as the following example shows.