12. Markov Chains (OR Models)

Lecture 12: Discrete-Time Markov Chains

Topics
- State transition matrix
- Network diagrams
- Examples: gambler's ruin, brand switching, IRS, craps
- Transient probabilities
- Steady-state probabilities

Many real-world systems contain uncertainty and evolve over time. Stochastic processes (and Markov chains) are probability models for such systems.

Discrete-Time Markov Chains

Origins: the Galton-Watson process, which asks when, and with what probability, a family name will become extinct.

A discrete-time stochastic process is a sequence of random variables X_0, X_1, X_2, ..., typically denoted by {X_n}.

Components of Stochastic Processes

The state space of a stochastic process is the set of all values that the X_n can take. (We will be concerned with stochastic processes that have a finite number of states.)
Time: n = 0, 1, 2, ...
State: a v-dimensional vector, s = (s_1, s_2, ..., s_v).
In general there are m states, written s_1, s_2, ..., s_m or s_0, s_1, ..., s_{m-1}.
At each time n, X_n takes one of the m values, so X_n ∈ S.

Gambler's Ruin

At time 0 I have X_0 = $2, and each day I make a $1 bet. I win with probability p and lose with probability 1 - p. I'll quit if I ever reach $4 or if I lose all my money.
State space: S = {0, 1, 2, 3, 4}.
Let X_n = the amount of money I have after the bet on day n.
If X_n = 4, then X_{n+1} = X_{n+2} = ... = 4.
If X_n = 0, then X_{n+1} = X_{n+2} = ... = 0.
Starting from X_0 = 2,
    X_1 = 3 with probability p,  X_1 = 1 with probability 1 - p.

Markov Chain Definition

A stochastic process {X_n} is called a Markov chain if
    Pr{X_{n+1} = j | X_0 = k_0, ..., X_{n-1} = k_{n-1}, X_n = i} = Pr{X_{n+1} = j | X_n = i}
(the transition probabilities) for every i, j, k_0, ..., k_{n-1} and for every n. Discrete time means n ∈ N = {0, 1, 2, ...}.
The future behavior of the system depends only on the current state i and not on any of the previous states.

Stationary Transition Probabilities

    Pr{X_{n+1} = j | X_n = i} = Pr{X_1 = j | X_0 = i} for all n
(the transition probabilities don't change over time). We will only consider stationary Markov chains.

The one-step transition matrix for a Markov chain with states S = {0, 1, 2} is

        | p_00  p_01  p_02 |
    P = | p_10  p_11  p_12 |
        | p_20  p_21  p_22 |

where p_ij = Pr{X_1 = j | X_0 = i}.

If the state space is S = {0, 1, ..., m - 1}, then we have...
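To make the one-step transition matrix concrete, here is a minimal numerical sketch (not part of the original lecture) that builds P for the gambler's ruin chain on S = {0, 1, 2, 3, 4}, with states 0 and 4 absorbing. The function name `gamblers_ruin_matrix` and the use of NumPy are illustrative choices, not something taken from the notes.

```python
import numpy as np

def gamblers_ruin_matrix(p=0.5):
    """One-step transition matrix for gambler's ruin on S = {0, 1, 2, 3, 4}.

    p is the probability of winning a $1 bet; states 0 and 4 are absorbing.
    """
    m = 5
    P = np.zeros((m, m))
    P[0, 0] = 1.0              # ruined: stay at $0 forever
    P[4, 4] = 1.0              # reached the $4 goal: stop betting
    for i in range(1, 4):      # interior states 1, 2, 3
        P[i, i + 1] = p        # win the bet: move up $1
        P[i, i - 1] = 1 - p    # lose the bet: move down $1
    return P

P = gamblers_ruin_matrix(p=0.5)
print(P)
```

Each row of P sums to 1, as a stochastic matrix must, and entry P[i, j] is exactly p_ij = Pr{X_1 = j | X_0 = i}.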
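The "transient probabilities" topic listed above can be illustrated the same way: the distribution of X_n is obtained by repeatedly multiplying the initial distribution by P, i.e. pi_n = pi_0 P^n. The sketch below assumes p = 0.5 and the starting state X_0 = $2 from the example; the variable names are illustrative.

```python
import numpy as np

# One-step matrix for gambler's ruin with p = 0.5 (same chain as the sketch above).
P = np.array([
    [1.0, 0.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0, 0.0],
    [0.0, 0.5, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.0, 1.0],
])

pi0 = np.array([0.0, 0.0, 1.0, 0.0, 0.0])   # X_0 = $2 with certainty

pi = pi0
for n in range(1, 11):
    pi = pi @ P                              # pi_n = pi_{n-1} P
    print(f"n = {n:2d}: {np.round(pi, 4)}")
```

The printed vector at step n gives Pr{X_n = j} for j = 0, ..., 4; as n grows, the probability mass drifts into the absorbing states 0 and 4.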
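For the "steady-state probabilities" topic, a standard computation (not shown in this preview) is to solve pi = pi P together with the requirement that the probabilities sum to 1. The gambler's ruin chain is absorbing, so the sketch below instead uses a small brand-switching-style matrix whose numbers are invented purely to demonstrate the linear-algebra step; they do not come from the lecture.

```python
import numpy as np

# Hypothetical 3-brand switching matrix (illustrative numbers only).
P = np.array([
    [0.8, 0.1, 0.1],
    [0.2, 0.7, 0.1],
    [0.1, 0.3, 0.6],
])

m = P.shape[0]
# Solve pi (P - I) = 0 together with sum(pi) = 1, stacked as one least-squares system.
A = np.vstack([P.T - np.eye(m), np.ones(m)])
b = np.zeros(m + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(pi, 4))   # steady-state probabilities: pi = pi P, sum(pi) = 1
```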