# Lecture 3: Key Observation for Markov Chains


Introductory Engineering Stochastic Processes, ORIE 3510
Instructor: Mark E. Lewis, Professor
School of Operations Research and Information Engineering, Cornell University

Topics: Key Observation for Markov Chains; Transition Diagrams; Short-Term and Long-Term (Limiting/Stationary) Behavior; Transience and Recurrence; Key Observations.

Disclaimer: These notes are only meant as a lecture supplement, not a substitute!

## Key Observation for Markov Chains

Remark: If you know the initial state, all of the randomness is captured in the one-step transition matrix. This means that if you are ever asked to model a process as a Markov chain, you need to specify the initial distribution and P.

The transient distribution of a DTMC {X_n, n ≥ 0} on state space X is easily obtained from P:

P(X_n = j | X_0 = i) = (P^n)_{ij} for any i, j ∈ X.

Two questions (ok, three):

- Is it simple to obtain P?
- Can visual techniques help?
- Can P be used to obtain the short-term and long-term behavior?
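The identity P(X_n = j | X_0 = i) = (P^n)_{ij} can be checked numerically. The sketch below uses a hypothetical two-state chain (the matrix entries are made up for illustration, not from the lecture) and computes the n-step transition matrix as a matrix power:

```python
import numpy as np

# Hypothetical two-state chain (states 0 and 1); P is the one-step
# transition matrix: P[i, j] = P(X_{n+1} = j | X_n = i).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# n-step transition probabilities: P(X_n = j | X_0 = i) = (P^n)_{ij}.
n = 3
Pn = np.linalg.matrix_power(P, n)

# Each row of P^n is a conditional distribution, so rows sum to 1.
assert np.allclose(Pn.sum(axis=1), 1.0)

# As the remark notes, the transient distribution needs both the
# initial distribution mu and P: the distribution of X_n is mu @ P^n.
mu = np.array([1.0, 0.0])          # start in state 0 with certainty
dist_n = mu @ Pn

print(Pn)
print(dist_n)
```

Raising P to a large power gives a first look at the long-term (limiting) behavior discussed later: for this chain the rows of P^n approach a common limiting distribution as n grows.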