Chapter 14 Feller Processes

Section 14.1 fulfills the demand, made last time, for an example of a Markov process which is not strongly Markovian. Section 14.2 makes explicit the idea that the transition kernels of a Markov process induce a kernel over sample paths, mostly to fix notation for later use. Section 14.3 defines Feller processes, which link the cadlag and strong Markov properties.

14.1 An Example of a Markov Process Which Is Not Strongly Markovian

This is taken from Fristedt and Gray (1997, pp. 626–627).

Example 138 We will construct an R²-valued Markov process on [0, ∞) which is not strongly Markovian. Begin by defining the following map from R to R²:

    f(w) = (w, 0)               w ≤ 0
           (sin w, 1 − cos w)   0 < w < 2π
           (w − 2π, 0)          w ≥ 2π          (14.1)

When w is at or below zero, or at or above 2π, f(w) moves along the x-axis of the plane; in between, it moves along a circle of radius 1, centered at (0, 1), which it enters and leaves at the origin. Notice that f is invertible everywhere except at the origin, which is ambiguous between w = 0 and w = 2π.

Let X(t) = f(W(t) + π), where W(t) is a standard Wiener process. At all t, P(W(t) + π = 0) = P(W(t) + π = 2π) = 0, so, with probability 1, X(t) can be inverted to recover W(t). Since W(t) is a Markov process, it follows that P(X(t + h) ∈ B | X(t) = x) = P(X(t + h) ∈ B | ℱ^X_t) almost surely, i.e., X is Markov.

Now consider τ = inf {t : X(t) = (0, 0)}, the hitting time of the origin. This is clearly an ℱ^X-optional time, and equally clearly almost surely finite, because, with probability 1, W(t) will leave the interval (−π, π) within a finite time. But, equally clearly, the future behavior of X will be very different depending on whether it hits the origin because W = π or because W = −π, and this cannot be determined from X alone. Hence there is at least one optional time at which X is not strongly Markovian, so X is not a strong Markov process.
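The construction in Example 138 is concrete enough to simulate. The sketch below (not part of the original notes; it assumes numpy, and the helper names `f` and `simulate_X` are illustrative) implements the map of Eq. (14.1), checks the non-invertibility at the origin, and builds X(t) = f(W(t) + π) from a Wiener path made of Gaussian increments.

```python
import numpy as np

def f(w):
    """The map of Eq. (14.1): along the x-axis outside [0, 2*pi],
    around the unit circle centered at (0, 1) inside it.  Note that
    f(0) == f(2*pi) == (0, 0), so f is not invertible at the origin."""
    if w <= 0:
        return (w, 0.0)
    elif w < 2 * np.pi:
        return (np.sin(w), 1.0 - np.cos(w))
    else:
        return (w - 2 * np.pi, 0.0)

def simulate_X(T=5.0, n=10_000, rng=None):
    """Simulate X(t) = f(W(t) + pi) on [0, T], approximating the
    standard Wiener process W by cumulative Gaussian increments."""
    rng = np.random.default_rng(rng)
    dt = T / n
    W = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n))])
    return np.array([f(w + np.pi) for w in W])

# The ambiguity driving the counterexample: both pre-images hit (0, 0),
# so observing X at the hitting time cannot tell W = pi from W = -pi.
assert f(0.0) == f(2 * np.pi) == (0.0, 0.0)
```

The final assertion is exactly the failure of invertibility the proof exploits: the two boundary pre-images collapse to one point of the plane.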
14.2 Markov Families

We have been fairly cavalier about the idea of a Markov process having a particular initial state or initial distribution, basically relying on our familiarity with these ideas from elementary courses on stochastic processes. For future purposes, however, it is helpful to bring these notions formally within our general framework, and to fix some notation.

Definition 139 (Initial Distribution, Initial State) Let Ξ be a Borel space with σ-field 𝒳, T be a one-sided index set, and μ_{t,s} be a collection of Markovian transition kernels on Ξ. Then the Markov process with initial distribution ν, X_ν, is the Markov process whose finite-dimensional distributions are given by the action of μ_{t,s} on ν. That is, for all t_1 ≤ t_2 ≤ ... ≤ t_n,

    L(X(0), X(t_1), X(t_2), ..., X(t_n)) = ν ⊗ μ_{0,t_1} ⊗ μ_{t_1,t_2} ⊗ ... ⊗ μ_{t_{n−1},t_n}          (14.2)

If ν = δ(x − a), the delta distribution at a, then we write...
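In the special case of a finite state space and integer times, Eq. (14.2) becomes ordinary matrix algebra, which may make the "action of the kernels on ν" concrete. The sketch below is an illustration under that simplifying assumption (numpy; the matrix `mu` and the helpers `marginal` and `fdd` are hypothetical names, not from the notes): the kernel is a row-stochastic matrix, μ_{s,t} = mu^(t−s), and the joint p.m.f. of (X(t_1), ..., X(t_n)) is built by chaining the kernels starting from ν.

```python
import numpy as np

# A 3-state, time-homogeneous Markovian kernel on {0, 1, 2}:
# mu[i, j] = P(X(t+1) = j | X(t) = i), rows sum to 1.
mu = np.array([[0.5, 0.5, 0.0],
               [0.1, 0.8, 0.1],
               [0.0, 0.5, 0.5]])

def marginal(nu, t):
    """Distribution of X(t) when X(0) ~ nu: nu acted on by the
    kernel t times, i.e. nu @ mu**t."""
    return nu @ np.linalg.matrix_power(mu, t)

def fdd(nu, times):
    """Joint p.m.f. of (X(t_1), ..., X(t_n)) at increasing integer
    times, built as in Eq. (14.2): start from nu, then chain the
    transition kernels mu_{t_{k-1}, t_k} = mu**(t_k - t_{k-1})."""
    joint = np.asarray(nu) @ np.linalg.matrix_power(mu, times[0])
    prev = times[0]
    for t in times[1:]:
        step = np.linalg.matrix_power(mu, t - prev)
        # joint[..., i] * step[i, j] -> joint[..., i, j]
        joint = joint[..., :, None] * step
        prev = t
    return joint
```

An initial state a in the sense of the definition corresponds to the point mass nu = np.eye(3)[a]; summing the joint array over its earlier axes recovers the later one-time marginals, as consistency of the finite-dimensional distributions requires.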
This note was uploaded on 12/20/2011 for the course STAT 36-754 taught by Professor Schalizi during the Spring '06 term at University of Michigan.