Lecture 3 Presentation - Key Observation for Markov Chains


Introductory Engineering Stochastic Processes, ORIE 3510
Instructor: Mark E. Lewis, Professor
School of Operations Research and Information Engineering, Cornell University

Disclaimer: These notes are meant only as a lecture supplement, not a substitute!

Outline
- Key Observation for Markov Chains
- Transition Diagrams
- Short-Term and Long-Term (Limiting/Stationary) Behavior
- Transience and Recurrence
- Key Observations

Key Observation for Markov Chains

Remark: If you know the initial state, all of the randomness is captured in the one-step transition matrix. This means that if you are ever asked to model a process as a Markov chain, you need to specify the initial distribution and P.

The transient distribution of the DTMC {X_n, n ≥ 0} on state space X is easily obtained from P:

    P(X_n = j | X_0 = i) = (P^n)_{ij}   for any i, j ∈ X.

Two questions (ok, three):
- Is it simple to obtain P?
- Can visual techniques help?
- Can P be used to obtain the short-term and long-term behavior?
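As an illustration, here is a minimal numerical sketch of the formula above, in Python with NumPy. The two-state chain and its transition matrix are made up for the example; only the identity P(X_n = j | X_0 = i) = (P^n)_{ij} comes from the lecture.

    import numpy as np

    # Hypothetical two-state chain: state 0 = "sunny", state 1 = "rainy".
    # One-step transition matrix P; each row is a probability distribution.
    P = np.array([[0.9, 0.1],
                  [0.5, 0.5]])

    # Transient distribution: P(X_n = j | X_0 = i) is the (i, j) entry of P^n.
    n = 5
    Pn = np.linalg.matrix_power(P, n)
    print("P(X_5 = rainy | X_0 = sunny) =", round(Pn[0, 1], 4))

    # If the initial state is itself random with distribution mu (a row vector),
    # the distribution of X_n is mu @ P^n.
    mu = np.array([0.5, 0.5])
    print("Distribution of X_5:", mu @ Pn)

Raising P to larger and larger powers in the same way previews the long-term (limiting/stationary) behavior discussed later in the lecture.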