EMSE 208 Lecture 4 - Markov Chains

1. Markov Chains: Definition

A stochastic process {X(t), t ∈ T} is a collection of random variables; that is, for each t ∈ T, X(t) is a random variable. The index t is often interpreted as time (though not necessarily), and so we say that X(t) is the state of the process at time t. T is called the index set. If T is a countable set, the stochastic process is a discrete-time process; if T is a subset of the real line, it is a continuous-time process. The state space of a stochastic process is the set of all possible values that the random variables X(t) can assume; it, too, may be countable or continuous.

2. Markov Chains: Definition

A Markov chain is a stochastic process whose index set is the set of nonnegative integers n = 0, 1, 2, ... and whose state space is countable (possibly finite). We denote the process {X_n, n = 0, 1, 2, ...}, and if X_n = i we say the process is in state i at time n. We require, for all states i_0, i_1, ..., i_{n-1}, i, j and all n ≥ 0,

P_{ij} = \Pr\{X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_1 = i_1, X_0 = i_0\} = \Pr\{X_{n+1} = j \mid X_n = i\}.

3. Markov Chains: Definition

The P_{ij} are called the one-step transition probabilities: P_{ij} is the probability that a process now in state i will move, in one time unit, to state j. The one-step transition matrix is

P = \begin{pmatrix} P_{00} & P_{01} & P_{02} & \cdots \\ P_{10} & P_{11} & P_{12} & \cdots \\ P_{20} & P_{21} & P_{22} & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}, \qquad P_{ij} \ge 0, \qquad \sum_j P_{ij} = 1 \text{ for } i = 0, 1, 2, \ldots

Rows of P sum to 1, but columns need not.

4. Markov Chains Examples: Weather Forecasting

Suppose Pr{rain tomorrow | rain today} = α and Pr{rain tomorrow | no rain today} = β. Define state 0 (rain) and state 1 (no rain), and let X_n be the state of the weather on day n. Then

P = \begin{pmatrix} \alpha & 1 - \alpha \\ \beta & 1 - \beta \end{pmatrix}

5. Markov Chains Examples: Communications System

A transmitted digit (0 or 1) must pass through several stages; let p denote the probability that a digit entering a stage leaves it unchanged. Let X_n be the state of the digit at stage n. Then

P = \begin{pmatrix} p & 1 - p \\ 1 - p & p \end{pmatrix}

6. Markov Chains Examples: Predicting Gary's Mood

Gary is cheerful (C), so-so (S), or glum (G). Define state 0 (C), state 1 (S), and state 2 (G), and let X_n be the state of Gary's mood on day n. Then (see the simulation sketch at the end of this section)

P = \begin{pmatrix} 0.5 & 0.4 & 0.1 \\ 0.3 & 0.4 & 0.3 \\ 0.2 & 0.3 & 0.5 \end{pmatrix}

7. Markov Chains Examples: Transforming to a Markov Chain

Suppose

Pr{rain tomorrow | rain today, rain yesterday} = 0.7
Pr{rain tomorrow | rain today, no rain yesterday} = 0.5
Pr{rain tomorrow | no rain today, rain yesterday} = 0.4
Pr{rain tomorrow | no rain today, no rain yesterday} = 0.2

Whether it rains tomorrow depends on the last two days, so the weather is not a Markov chain on the two states {rain, no rain}. Enlarge the state space instead:

state 0: rained today and yesterday
state 1: rained today but not yesterday
state 2: rained yesterday but not today
state 3: rained neither today nor yesterday

X_n is the state of the weather (the today/yesterday pair) on day n.

8. Markov Chains Examples: Transforming to a Markov Chain

With states 0 through 3 defined as above, the enlarged process is a Markov chain. For example, from state 0 (rained today and yesterday) it rains tomorrow with probability 0.7, which moves the chain back to state 0; otherwise it moves to state 2 (rain yesterday but not today). Reading off all four rows this way gives

P = \begin{pmatrix} 0.7 & 0 & 0.3 & 0 \\ 0.5 & 0 & 0.5 & 0 \\ 0 & 0.4 & 0 & 0.6 \\ 0 & 0.2 & 0 & 0.8 \end{pmatrix}
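The transition-matrix machinery above is easy to exercise in code. The following Python sketch is illustrative only and not part of the original lecture; the helper names check_stochastic and simulate are made up here. It validates the defining row-sum property from slide 3, simulates Gary's mood chain from slide 6, and then estimates each P_ij by the observed fraction of i-to-j transitions along the sampled path.

```python
import numpy as np

def check_stochastic(P, tol=1e-12):
    """Verify the defining properties of a one-step transition matrix:
    entries are nonnegative and every row sums to 1 (columns need not)."""
    P = np.asarray(P, dtype=float)
    assert (P >= 0).all(), "transition probabilities must be nonnegative"
    assert np.allclose(P.sum(axis=1), 1.0, atol=tol), "each row must sum to 1"
    return P

def simulate(P, x0, n_steps, seed=None):
    """Sample X_0, X_1, ..., X_n by drawing X_{n+1} from row X_n of P.
    The Markov property holds by construction: each draw depends only on
    the current state, never on the earlier history."""
    rng = np.random.default_rng(seed)
    path = [x0]
    for _ in range(n_steps):
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return np.array(path)

# Gary's mood chain (slide 6): states 0 = cheerful, 1 = so-so, 2 = glum.
P_mood = check_stochastic([[0.5, 0.4, 0.1],
                           [0.3, 0.4, 0.3],
                           [0.2, 0.3, 0.5]])

path = simulate(P_mood, x0=0, n_steps=10_000, seed=42)

# Estimate each P_ij by the observed fraction of i -> j transitions;
# the rows of this estimate should be close to the rows of P_mood.
counts = np.zeros((3, 3))
for i, j in zip(path[:-1], path[1:]):
    counts[i, j] += 1
print(counts / counts.sum(axis=1, keepdims=True))
```

With a long enough path, the printed matrix of empirical transition frequencies agrees with P_mood to roughly two decimal places, which is a quick sanity check that the sampler really uses only the current state.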
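The two-day weather example can be checked the same way. This second hypothetical sketch builds the four-state matrix of slide 8 directly from the four conditional probabilities given on slide 7; the tuple encoding (rain today, rain yesterday) is an assumption of this sketch, chosen to match the slide's state definitions.

```python
import numpy as np

# Conditional probabilities of rain tomorrow (slide 7), keyed by the
# assumed encoding (rain today, rain yesterday).
p_rain = {(True, True): 0.7, (True, False): 0.5,
          (False, True): 0.4, (False, False): 0.2}

# States in slide order:
# 0 = rain today and yesterday, 1 = rain today only,
# 2 = rain yesterday only,      3 = rain neither day.
states = [(True, True), (True, False), (False, True), (False, False)]

P = np.zeros((4, 4))
for i, (today, yesterday) in enumerate(states):
    p = p_rain[(today, yesterday)]
    for rain_tomorrow in (True, False):
        # One day later the pair shifts: "today" becomes "yesterday".
        j = states.index((rain_tomorrow, today))
        P[i, j] = p if rain_tomorrow else 1 - p

print(P)  # rows: [0.7 0 0.3 0], [0.5 0 0.5 0], [0 0.4 0 0.6], [0 0.2 0 0.8]
```

Note that each row still sums to 1: enlarging the state space restores the Markov property without losing any probability.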