# Chapter 11: Markov Chains


ENCS6161 - Probability and Stochastic Processes, Concordia University

## Markov Processes

A random process is a Markov process if the future of the process, given the present, is independent of the past. That is, if t_1 < t_2 < ... < t_k < t_{k+1}, then

P[X(t_{k+1}) = x_{k+1} | X(t_k) = x_k, ..., X(t_1) = x_1] = P[X(t_{k+1}) = x_{k+1} | X(t_k) = x_k]

if X(t) is discrete-valued, or

f_{X(t_{k+1})}(x_{k+1} | X(t_k) = x_k, ..., X(t_1) = x_1) = f_{X(t_{k+1})}(x_{k+1} | X(t_k) = x_k)

if X(t) is continuous-valued.
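The memoryless property can be checked empirically. The following sketch (not part of the original slides; the random-walk setup is an illustrative assumption) simulates a symmetric ±1 random walk and compares the conditional probability of the next step given the full past with the one given only the present state; by the Markov property both should be about 0.5.

```python
import random

random.seed(0)

def walk(n):
    # symmetric random walk: each step is +1 or -1 with equal probability
    s, path = 0, []
    for _ in range(n):
        s += random.choice((-1, 1))
        path.append(s)
    return path

N = 200_000
paths = [walk(3) for _ in range(N)]

# condition on the full past (S1 = 1, S2 = 0) vs. the present only (S2 = 0)
full = [p for p in paths if p[0] == 1 and p[1] == 0]
pres = [p for p in paths if p[1] == 0]

p_full = sum(p[2] == 1 for p in full) / len(full)
p_pres = sum(p[2] == 1 for p in pres) / len(pres)

print(p_full, p_pres)  # both close to 0.5
```

Knowing S_1 in addition to S_2 does not change the distribution of S_3, which is exactly the defining property above.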
## Markov Processes: Examples

Example: Let S_n = X_1 + X_2 + ... + X_n, where X_1, X_2, ... are independent random variables, so that S_{n+1} = S_n + X_{n+1}. Then

P[S_{n+1} = s_{n+1} | S_n = s_n, ..., S_1 = s_1] = P[S_{n+1} = s_{n+1} | S_n = s_n]

so S_n is a Markov process.

Example: The Poisson process is a continuous-time Markov process:

P[N(t_{k+1}) = j | N(t_k) = i, ..., N(t_1) = i_1] = P[j - i events in t_{k+1} - t_k] = P[N(t_{k+1}) = j | N(t_k) = i]

An integer-valued Markov process is called a Markov chain.
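The Poisson-process claim rests on independent increments: the number of events in (t_1, t_2] depends only on the interval length. This sketch (not from the slides; the rate λ = 2 and the times t_1 = 1, t_2 = 2.5 are arbitrary choices for illustration) simulates the process via exponential inter-arrival times and checks that the increment N(t_2) - N(t_1) has mean and variance λ(t_2 - t_1), as a Poisson random variable should.

```python
import random

random.seed(1)
lam, t1, t2 = 2.0, 1.0, 2.5  # assumed rate and observation times

def arrival_times(rate, horizon):
    # Poisson process: i.i.d. exponential(rate) inter-arrival times up to `horizon`
    times, s = [], random.expovariate(rate)
    while s <= horizon:
        times.append(s)
        s += random.expovariate(rate)
    return times

incs = []
for _ in range(50_000):
    ts = arrival_times(lam, t2)
    n1 = sum(t <= t1 for t in ts)   # N(t1)
    incs.append(len(ts) - n1)       # increment N(t2) - N(t1)

mean_inc = sum(incs) / len(incs)
var_inc = sum((x - mean_inc) ** 2 for x in incs) / len(incs)
print(mean_inc, var_inc)  # both close to lam * (t2 - t1) = 3.0
```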

## Discrete-Time Markov Chains

X_n is a discrete-time Markov chain that starts at n = 0 with initial pmf

p_i(0) = P[X_0 = i], i = 0, 1, 2, ...

Then, from the Markov property, the joint pmf factors as

P[X_n = i_n, X_{n-1} = i_{n-1}, ..., X_0 = i_0] = P[X_n = i_n | X_{n-1} = i_{n-1}] ··· P[X_1 = i_1 | X_0 = i_0] p_{i_0}(0)
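Once the one-step transition probabilities P[X_{n+1} = j | X_n = i] are collected into a matrix, the state pmf can be propagated step by step via p_j(n+1) = Σ_i p_i(n) P_ij. A minimal sketch (the two-state matrix and initial pmf below are assumed values, not from the slides):

```python
# hypothetical two-state chain: transition matrix P and initial pmf p0
P = [[0.9, 0.1],
     [0.4, 0.6]]
p0 = [1.0, 0.0]   # p_i(0) = P[X_0 = i]: start in state 0

def step(p, P):
    # one step of the chain: p_j(n+1) = sum_i p_i(n) * P[i][j]
    return [sum(p[i] * P[i][j] for i in range(len(p))) for j in range(len(P[0]))]

p = p0
for _ in range(50):
    p = step(p, P)
print(p)  # approaches the stationary pmf [0.8, 0.2]
```

For this matrix the stationary pmf solves πP = π, giving π = [0.8, 0.2], and the iteration converges to it regardless of the starting state.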