Introductory Engineering Stochastic Processes, ORIE 3510
Instructor: Mark E. Lewis, Associate Professor
School of Operations Research and Information Engineering, Cornell University
Disclaimer: these notes are meant only as a lecture supplement, not a substitute!

Outline: Key Observation for Markov Chains; Transition Diagrams; Short-Term and Long-Term (Limiting/Stationary) Behavior; Transience and Recurrence; Accessibility and Communication; Decomposing the State Space (Reducible DTMCs)

Key Observation for Markov Chains

Remark: If you know the initial state, all of the randomness is captured in the one-step transition matrix. This means that if you are ever asked to model a process as a Markov chain, you need to specify the initial distribution and P. Of course, we have assumed that the random environment does not change from period to period.

Two questions (ok, three):
- Is it simple to obtain P?
- Can visual techniques help?
- Can P be used to obtain the short-term and long-term behavior?

Transition Diagrams

- Choose a symbol for each state and circle it.
- For each state, draw arrows to every state that can be reached in one step with positive probability.
- Label each arc emanating from a state with the probability that it goes to the state to which the arrow points. For a fixed state i, the probability on the arc pointing to j is equal to p_ij.

It should be clear that the probabilities along the arcs emanating from i sum to one: ∑_{j ∈ S} p_ij = 1.
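The bookkeeping behind a transition diagram can be sketched in a few lines of code: store P as a matrix, check that each row sums to one, and list the positive entries, which are exactly the arcs you would draw. This is a minimal sketch with a made-up 3-state matrix, not one from the notes.

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows = current state i,
# columns = next state j). Entries p_ij are nonnegative and each
# row sums to 1, matching sum_{j in S} p_ij = 1 from the text.
P = np.array([
    [0.5, 0.5, 0.0],
    [0.2, 0.3, 0.5],
    [0.0, 0.4, 0.6],
])

# Row-sum condition: every row of a transition matrix sums to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# Arcs in the transition diagram correspond to positive entries:
# draw an arrow i -> j labeled p_ij whenever p_ij > 0.
arcs = [(i, j, P[i, j]) for i in range(3) for j in range(3) if P[i, j] > 0]
print(arcs)
```

Each tuple (i, j, p) is one labeled arrow of the diagram; zero entries produce no arrow, just as the drawing rule above prescribes.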
Example 1 – Brand Switching

There are 3 major cell phone companies: Jog, American Communications and Cellular (also called AC&C), and Horizon. For ease of notation, label them brands A, B, and C, respectively.
- A customer that uses brand A switches to brand B (C) in the next period with probability .2 (.7) and stays with A otherwise.
- A customer that uses brand B switches to brand A (C) with probability .2 (.4) and stays with B otherwise.
- A customer that uses brand C switches to brand A (B) with probability .1 (.3) and stays with C otherwise.

[Transition diagram with circled states A, B, C and labeled arcs; the figure did not survive this extraction.]
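The switching probabilities above determine the one-step matrix P directly (the "stays" probability fills each diagonal entry so the row sums to 1). As a sketch of the short-term behavior mentioned earlier, squaring P gives the two-step transition probabilities via the Chapman-Kolmogorov relation:

```python
import numpy as np

# One-step transition matrix for the brand-switching example,
# states ordered A, B, C. Off-diagonal entries come from the
# stated switching probabilities; the diagonal is "stays".
P = np.array([
    [0.1, 0.2, 0.7],  # A: stays .1, to B .2, to C .7
    [0.2, 0.4, 0.4],  # B: to A .2, stays .4, to C .4
    [0.1, 0.3, 0.6],  # C: to A .1, to B .3, stays .6
])
assert np.allclose(P.sum(axis=1), 1.0)

# Two-step probabilities: P^(2) = P @ P. For example, P2[0, 2]
# is the probability a brand-A customer uses brand C two
# periods from now.
P2 = P @ P
print(np.round(P2, 3))
```

For instance, P2[0, 2] = .1(.7) + .2(.4) + .7(.6) = .57, summing over the intermediate brand used one period from now.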