Chapter 11: Markov Chains
ENCS6161 - Probability and Stochastic Processes
Concordia University

Markov Processes

A random process is a Markov process if the future of the process given the present is independent of the past; i.e., if $t_1 < t_2 < \cdots < t_k < t_{k+1}$, then

$$P[X(t_{k+1}) = x_{k+1} \mid X(t_k) = x_k, \ldots, X(t_1) = x_1] = P[X(t_{k+1}) = x_{k+1} \mid X(t_k) = x_k]$$

if $X(t)$ is discrete-valued, or

$$f_{X(t_{k+1})}(x_{k+1} \mid X(t_k) = x_k, \ldots, X(t_1) = x_1) = f_{X(t_{k+1})}(x_{k+1} \mid X(t_k) = x_k)$$

if $X(t)$ is continuous-valued.

Example: Let $X_1, X_2, \ldots$ be independent random variables and $S_n = X_1 + X_2 + \cdots + X_n$. Then $S_{n+1} = S_n + X_{n+1}$, so

$$P[S_{n+1} = s_{n+1} \mid S_n = s_n, \ldots, S_1 = s_1] = P[S_{n+1} = s_{n+1} \mid S_n = s_n].$$

Hence $S_n$ is a Markov process.

Example: The Poisson process is a continuous-time Markov process:

$$P[N(t_{k+1}) = j \mid N(t_k) = i, \ldots, N(t_1) = x_1] = P[\,j - i \text{ events in } t_{k+1} - t_k\,] = P[N(t_{k+1}) = j \mid N(t_k) = i].$$

An integer-valued Markov process is called a Markov chain.
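The sum-process example can be checked numerically. The sketch below (an illustration not from the notes; it assumes $\pm 1$ steps with probability 1/2 each) simulates many short random walks and compares the conditional probability of the next state given only the present, $P[S_3 = 1 \mid S_2 = 0]$, with the same probability given present and past, $P[S_3 = 1 \mid S_2 = 0, S_1 = 1]$. Both should come out near 0.5, consistent with the Markov property.

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths = 200_000

# X_i are i.i.d. +-1 steps; S_n = X_1 + ... + X_n is a random walk
steps = rng.choice([-1, 1], size=(n_paths, 3))
S = steps.cumsum(axis=1)  # S[:, n-1] holds S_n for each path

# Condition on the present only: P[S_3 = 1 | S_2 = 0]
p_present = np.mean(S[S[:, 1] == 0, 2] == 1)

# Condition on present AND past: P[S_3 = 1 | S_2 = 0, S_1 = 1]
mask = (S[:, 1] == 0) & (S[:, 0] == 1)
p_past = np.mean(S[mask, 2] == 1)

print(f"given present only: {p_present:.3f}, given present and past: {p_past:.3f}")
```

Because $S_3 = S_2 + X_3$ and $X_3$ is independent of the earlier steps, adding $S_1$ to the conditioning changes nothing: both estimates agree.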
This note was uploaded on 01/15/2011 for the course ECE 6161 taught by Professor Khkjk during the Winter '10 term at Concordia Canada.