# handout_week15_a - Stochastic Signals and Systems: Markov Chains


Stochastic Signals and Systems, Virginia Tech, Fall 2008

## Markov Chains: Introduction

Markov processes are the simplest generalization of independent processes: the outcome at any instant is permitted to depend on the outcome that immediately precedes it, and on none before that. Thus in a Markov process $X(t)$, the past has no influence on the future once the present is specified. That is, for arbitrary times $t_1 < t_2 < \cdots < t_k < t_{k+1}$,

$$P[X(t_{k+1}) = x_{k+1} \mid X(t_k) = x_k, \ldots, X(t_1) = x_1] = P[X(t_{k+1}) = x_{k+1} \mid X(t_k) = x_k]$$

if $X(t)$ is discrete-valued, and

$$P[a < X(t_{k+1}) \le b \mid X(t_k) = x_k, \ldots, X(t_1) = x_1] = P[a < X(t_{k+1}) \le b \mid X(t_k) = x_k]$$

if $X(t)$ is continuous-valued.
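The discrete-valued Markov property can be checked empirically on a simulated chain. The sketch below uses a hypothetical two-state chain whose transition probabilities are assumptions chosen only for illustration; it estimates both sides of the defining equation and shows that the extra conditioning on the earlier state has no effect.

```python
import random

# Hypothetical two-state chain (states 0 and 1); these transition
# probabilities are assumptions chosen only for illustration.
P = {0: {0: 0.7, 1: 0.3},
     1: {0: 0.4, 1: 0.6}}

def step(state, rng):
    """Draw the next state using only the current state."""
    return 0 if rng.random() < P[state][0] else 1

def simulate(n, rng):
    """Generate a path X_1, ..., X_n, with X_1 uniform on {0, 1}."""
    path = [rng.choice([0, 1])]
    for _ in range(n - 1):
        path.append(step(path[-1], rng))
    return path

rng = random.Random(0)
paths = [simulate(3, rng) for _ in range(200_000)]

# Estimate P[X_3=1 | X_2=1, X_1=0] versus P[X_3=1 | X_2=1]; by the
# Markov property the extra conditioning on X_1 should not change the
# answer (both estimates should be close to P[1][1] = 0.6).
both = [p for p in paths if p[0] == 0 and p[1] == 1]
last = [p for p in paths if p[1] == 1]
p_both = sum(p[2] == 1 for p in both) / len(both)
p_last = sum(p[2] == 1 for p in last) / len(last)
print(round(p_both, 2), round(p_last, 2))
```

With enough sample paths the two estimates agree to within sampling error, which is exactly what the defining equation asserts.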

### One-dimensional random walk

The sum process

$$S_n = X_1 + X_2 + \cdots + X_n = S_{n-1} + X_n$$

is a Markov process:

$$P[S_{n+1} = s_{n+1} \mid S_n = s_n, \ldots, S_1 = s_1] = P[S_{n+1} = s_{n+1} \mid S_n = s_n]$$

### Joint probabilities of a Markov process

If $X(t)$ is a Markov process, then

$$\begin{aligned}
P[X(t_3) = x_3, X(t_2) = x_2, X(t_1) = x_1]
&= P[X(t_3) = x_3 \mid X(t_2) = x_2, X(t_1) = x_1]\, P[X(t_2) = x_2, X(t_1) = x_1] \\
&= P[X(t_3) = x_3 \mid X(t_2) = x_2]\, P[X(t_2) = x_2, X(t_1) = x_1] \\
&= P[X(t_3) = x_3 \mid X(t_2) = x_2]\, P[X(t_2) = x_2 \mid X(t_1) = x_1]\, P[X(t_1) = x_1]
\end{aligned}$$

More generally,

$$P[X(t_{k+1}) = x_{k+1}, X(t_k) = x_k, \ldots, X(t_1) = x_1] = P[X(t_{k+1}) = x_{k+1} \mid X(t_k) = x_k]\, P[X(t_k) = x_k \mid X(t_{k-1}) = x_{k-1}] \cdots P[X(t_2) = x_2 \mid X(t_1) = x_1]\, P[X(t_1) = x_1]$$
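The factorization of a joint probability into one-step transition probabilities can be exercised directly on the random walk. The sketch below assumes i.i.d. steps of $\pm 1$, each with probability $1/2$, starting from $S_0 = 0$ (the step distribution and starting point are assumptions for illustration); the probability of an entire path is then the product of one-step transition probabilities.

```python
from fractions import Fraction

# Assumed step distribution: X_i = +1 or -1, each with probability 1/2.
p_step = {+1: Fraction(1, 2), -1: Fraction(1, 2)}

def path_probability(s):
    """P[S_1 = s_1, ..., S_n = s_n] via the Markov factorization:
    a product of one-step transition probabilities P[S_k | S_{k-1}],
    starting from S_0 = 0."""
    prob = Fraction(1)
    prev = 0
    for s_k in s:
        step = s_k - prev          # transition probability depends only
        prob *= p_step.get(step, Fraction(0))  # on the current increment
        prev = s_k
    return prob

# The path 0 -> 1 -> 2 -> 1 takes three steps of probability 1/2 each.
print(path_probability([1, 2, 1]))   # -> 1/8

# A jump of size 2 is impossible, so such a path has probability 0.
print(path_probability([2]))         # -> 0
```

Using exact rationals makes the product of transition probabilities visible without floating-point noise.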
A special kind of Markov process is a Markov chain, in which the system can occupy a finite or countably infinite number of states.
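A finite-state Markov chain is fully specified by its state space and a row-stochastic transition matrix. The following minimal sketch (the three-state space and the specific matrix entries are assumptions chosen for illustration) samples a trajectory of such a chain.

```python
import random

# Hypothetical finite state space and row-stochastic transition matrix;
# entry P[i][j] is the probability of moving from state i to state j.
STATES = [0, 1, 2]
P = [[0.5, 0.3, 0.2],
     [0.1, 0.8, 0.1],
     [0.3, 0.3, 0.4]]

def next_state(i, rng):
    """Sample the next state from row i of the transition matrix."""
    return rng.choices(STATES, weights=P[i])[0]

def run(n, rng, start=0):
    """Simulate n transitions starting from `start`."""
    chain = [start]
    for _ in range(n):
        chain.append(next_state(chain[-1], rng))
    return chain

rng = random.Random(1)
chain = run(10, rng)
print(chain)
```

Each row of the matrix sums to one, so every state has a valid conditional distribution over its successors, which is all the chain needs to evolve.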



