Statistics 150: Spring 2007
February 7, 2007
1 Markov Chains

Let $\{X_0, X_1, \dots\}$ be a sequence of random variables taking values in some countable set $S$, called the state space. Each $X_n$ is a discrete random variable that takes one of $N$ possible values, where $N = |S|$; it may be the case that $N = \infty$.

Definition. The process $X$ is a Markov chain if it satisfies the Markov condition:
\[
P(X_n = s \mid X_0 = x_0, X_1 = x_1, \dots, X_{n-1} = x_{n-1}) = P(X_n = s \mid X_{n-1} = x_{n-1})
\]
for all $n \ge 1$ and all $s, x_0, \dots, x_{n-1} \in S$.
Definition. The chain $X$ is called homogeneous if
\[
P(X_{n+1} = j \mid X_n = i) = P(X_1 = j \mid X_0 = i)
\]
for all $n, i, j$. The transition matrix $P = (p_{ij})$ is the $|S| \times |S|$ matrix of transition probabilities
\[
p_{ij} = P(X_{n+1} = j \mid X_n = i).
\]
Henceforth, all Markov chains are assumed homogeneous unless otherwise specified.
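To make the definitions concrete, here is a minimal simulation sketch of a homogeneous chain. The two-state matrix $P$, the seed, and the function name `simulate_chain` are illustrative choices, not part of the notes:

```python
import numpy as np

# Hypothetical two-state transition matrix (illustrative only):
# from state 0: stay with prob. 0.9, move to 1 with prob. 0.1;
# from state 1: move to either state with prob. 0.5.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

def simulate_chain(P, x0, n_steps, seed=0):
    """Simulate X_0, ..., X_{n_steps}: each step depends only on the current state."""
    rng = np.random.default_rng(seed)
    states = [x0]
    for _ in range(n_steps):
        states.append(rng.choice(len(P), p=P[states[-1]]))
    return states

print(simulate_chain(P, x0=0, n_steps=10))
```

Note that the update uses only `states[-1]`, which is exactly the Markov condition: the conditional distribution of the next state depends on the past only through the present.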
Theorem. The transition matrix $P$ is a stochastic matrix, which is to say that:
(a) $P$ has non-negative entries: $p_{ij} \ge 0$ for all $i, j$;
(b) $P$ has row sums equal to one: $\sum_j p_{ij} = 1$ for all $i$.
Proof. An easy exercise: each entry is a probability, and each row sums to one because, given $X_n = i$, the variable $X_{n+1}$ must take some value in $S$. $\square$

Definition. The $n$-step transition matrix $P(m, m+n) = (p_{ij}(m, m+n))$ is the matrix of $n$-step transition probabilities
\[
p_{ij}(m, m+n) = P(X_{m+n} = j \mid X_m = i).
\]
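Properties (a) and (b) are easy to check numerically; this quick sketch reuses the illustrative matrix from above (an assumption, not from the notes):

```python
import numpy as np

P = np.array([[0.9, 0.1],   # the illustrative matrix from the sketch above
              [0.5, 0.5]])

assert (P >= 0).all()                    # (a) non-negative entries
assert np.allclose(P.sum(axis=1), 1.0)   # (b) each row sums to one
```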
Theorem (Chapman-Kolmogorov equations).
\[
p_{ij}(m, m+n+r) = \sum_k p_{ik}(m, m+n) \, p_{kj}(m+n, m+n+r).
\]
Therefore $P(m, m+n+r) = P(m, m+n) P(m+n, m+n+r)$, and $P(m, m+n) = P^n$, the $n$th power of $P$.

Proof. We have, as required,
\begin{align*}
p_{ij}(m, m+n+r) &= P(X_{m+n+r} = j \mid X_m = i) \\
&= \sum_k P(X_{m+n+r} = j, X_{m+n} = k \mid X_m = i) \\
&= \sum_k P(X_{m+n+r} = j \mid X_{m+n} = k, X_m = i) \, P(X_{m+n} = k \mid X_m = i) \\
&= \sum_k P(X_{m+n+r} = j \mid X_{m+n} = k) \, P(X_{m+n} = k \mid X_m = i),
\end{align*}
where we have used the fact that $P(A \cap B \mid C) = P(A \mid B \cap C) P(B \mid C)$, together with the Markov property. The established equation may be written in matrix form as $P(m, m+n+r) = P(m, m+n) P(m+n, m+n+r)$, and it follows by iteration, each one-step matrix being equal to $P$ by homogeneity, that $P(m, m+n) = P^n$. $\square$
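The identity can be checked numerically for a homogeneous chain: the two-step probabilities obtained by summing over the intermediate state agree with the matrix square. A sketch, again with the illustrative matrix:

```python
import numpy as np

P = np.array([[0.9, 0.1],   # illustrative transition matrix
              [0.5, 0.5]])

# p_ij(0, 2) = sum_k p_ik p_kj: sum over the intermediate state k
two_step = np.array([[sum(P[i, k] * P[k, j] for k in range(2))
                      for j in range(2)] for i in range(2)])
assert np.allclose(two_step, np.linalg.matrix_power(P, 2))
```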
Let $\mu_i^{(n)} = P(X_n = i)$ be the mass function of $X_n$, and write $\mu^{(n)}$ for the row vector with entries $(\mu_i^{(n)} : i \in S)$.

Lemma. $\mu^{(m+n)} = \mu^{(m)} P^n$, and hence $\mu^{(n)} = \mu^{(0)} P^n$.
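In matrix terms, the lemma says that distributions propagate by right-multiplication by $P$. A minimal sketch, assuming the illustrative matrix above and a starting distribution concentrated on state 0:

```python
import numpy as np

P = np.array([[0.9, 0.1],    # illustrative transition matrix
              [0.5, 0.5]])
mu0 = np.array([1.0, 0.0])   # start in state 0 with probability one

# mu^(n) = mu^(0) P^n: the row vector of the mass function of X_n
for n in range(4):
    print(n, mu0 @ np.linalg.matrix_power(P, n))
```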
2 Exercises

1) A die is rolled repeatedly. Which of the following are Markov chains? For those that are, supply the transition matrix.
(a) The largest number $X_n$ shown up to the $n$th roll.
(b) The number $N_n$ of sixes in $n$ rolls.
(c) At time $r$, the time $C_r$ since the most recent six.
(d) At time $r$, the time $B_r$ until the next six.
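Before deciding which of these processes is Markov, it can help to simulate one run of rolls and track all four. This sketch is illustrative only; in particular, the conventions chosen for $C_r$ and $B_r$ (how a six at the current time and an unseen future six are handled) are assumptions, not part of the exercise:

```python
import numpy as np

rng = np.random.default_rng(1)
rolls = rng.integers(1, 7, size=20)      # twenty simulated die rolls

X = np.maximum.accumulate(rolls)         # (a) largest number shown so far
N = np.cumsum(rolls == 6)                # (b) number of sixes so far

# (c) C_r: rolls since the most recent six (zero at a six; one convention)
C, since = [], 0
for r in rolls:
    since = 0 if r == 6 else since + 1
    C.append(since)

# (d) B_r: rolls until the next six strictly after time r; note that this
# looks into the future, so it is computed by a backward scan, and it is
# None when no six has yet appeared to the right
B, dist = [None] * len(rolls), None
for i in range(len(rolls) - 1, -1, -1):
    B[i] = dist
    dist = 1 if rolls[i] == 6 else (None if dist is None else dist + 1)

print(rolls, X, N, C, B, sep="\n")
```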
2) Let $X$ be a Markov chain on $S$, and let $T$ be a random variable taking values in $\{0, 1, 2, \dots\}$ with the property that the indicator function $1_{\{T = n\}}$ of the event that $T = n$ is a function of the variables $X_1, X_2, \dots, X_n$. Such a random variable $T$ is called a stopping time, and the above definition requires that it is decidable whether or not $T = n$ with a knowledge only of the past and present, with no further information about the future.
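A first hitting time is the canonical example of a stopping time: whether $T = n$ is decided by $X_0, \dots, X_n$ alone. A minimal sketch, with an illustrative two-state chain and target state:

```python
import numpy as np

P = np.array([[0.9, 0.1],   # illustrative two-state chain
              [0.5, 0.5]])

def hitting_time(P, x0, target, seed=2, max_steps=10_000):
    """T = min{n : X_n = target} is a stopping time: deciding whether T = n
    uses only X_0, ..., X_n, with no knowledge of the future."""
    rng = np.random.default_rng(seed)
    x = x0
    for n in range(max_steps):
        if x == target:
            return n
        x = rng.choice(len(P), p=P[x])
    return None  # target not hit within max_steps

print(hitting_time(P, x0=0, target=1))
```

By contrast, the time of the last visit to a state is not a stopping time, since deciding it requires knowing the future of the chain.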