Markov Chains I: Stochastic Processes and Markov Chains

Introductory Engineering Stochastic Processes, ORIE 3510
Instructor: Mark E. Lewis, Professor
School of Operations Research and Information Engineering, Cornell University

Topics: Stochastic Processes; Markov Chains; Transition Probabilities; Examples; The Transient Distribution; Key Observation/Basic Concepts

Disclaimer: These notes are meant only as a lecture supplement, not a substitute!

Basic Definitions: Stochastic Processes

A stochastic process is a sequence of random variables, usually indexed by time. For example:

- In continuous time: {X_t, t >= 0}
- In discrete time: {X_n, n >= 0} (we start here)

The value of each random variable usually represents the state of some process. The set of all possible states (for all time) is called the state space. If we follow a realization of a stochastic process for all time, it is called a sample path of the process.

Examples:
- The Dow Jones Industrial Average (DJIA) graphed over time
- Daily inventory levels
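The inventory example above can be made concrete with a short simulation. This is a minimal sketch, not from the lecture: it models daily inventory as a simple random walk truncated at zero, and each run of the function produces one sample path of the process.

```python
import random

def sample_path(n_steps, start=10, seed=0):
    """Simulate one sample path of a toy daily-inventory process:
    each day the level moves up or down by one unit at random,
    truncated at zero (a simple reflected random walk)."""
    rng = random.Random(seed)  # fixed seed so the realization is reproducible
    path = [start]
    for _ in range(n_steps):
        step = rng.choice([-1, 1])          # one unit sold or restocked
        path.append(max(0, path[-1] + step))  # inventory cannot go negative
    return path

# One realization: the sample path X_0, X_1, ..., X_20
path = sample_path(20)
print(path)
```

Different seeds give different realizations, which is exactly the distinction between the process (the whole random object) and a sample path (one observed trajectory).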
The History and Future of a Stochastic Process

Before viewing the process, the path is unknown. After viewing some portion of the path, {X_1, X_2, ..., X_n} or {X_s, s < t} is called the history up until time n or t, respectively. Similarly, (X_{n+1}, X_{n+2}, ...) or {X_s, s > t} is called the future of the process after time n or t, respectively. At time n we have

  { X_0, X_1, ... (past), X_n, X_{n+1}, X_{n+2}, ... (future) }
The Markov Property

Definition. If {X_n, n >= 0} is a stochastic process on the state space X, then the Markov property states that

  P(X_{n+1} = x_{n+1} | X_0 = x_0, X_1 = x_1, ..., X_n = x_n) = P(X_{n+1} = x_{n+1} | X_n = x_n),

for all x_0, x_1, ..., x_n, x_{n+1} in X.

That is, given the present state of the process, the future is independent of the past. If we know the current state of the system, we do not need to know anything about the past to make predictions about the future.
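The Markov property can be checked empirically. The sketch below (not from the lecture; the transition probabilities are made-up illustrative numbers) simulates a two-state chain and estimates P(X_{n+1} = 1 | ...) conditioned on the present state alone versus the present state plus one extra past state. By the Markov property, the extra conditioning should not change the estimate.

```python
import random

# Hypothetical two-state chain on {0, 1}:
# P(next = 1 | current = 0) = 0.3, P(next = 1 | current = 1) = 0.7.
P = {0: 0.3, 1: 0.7}

def simulate(n, seed=1):
    """Generate a path X_0, ..., X_n of the two-state chain."""
    rng = random.Random(seed)
    x = [0]
    for _ in range(n):
        x.append(1 if rng.random() < P[x[-1]] else 0)
    return x

x = simulate(200_000)

def cond_prob(history):
    """Estimate P(X_{n+1} = 1 | last len(history) states equal `history`)."""
    k = len(history)
    hits = [x[i + k] for i in range(len(x) - k)
            if x[i:i + k] == list(history)]
    return sum(hits) / len(hits)

print(cond_prob([1]))     # given present state 1 only: approx 0.7
print(cond_prob([0, 1]))  # extra past state adds nothing: still approx 0.7
print(cond_prob([1, 1]))  # likewise approx 0.7
```

All three estimates agree (up to sampling noise), illustrating that the conditional distribution of the future depends only on the present state.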
Markov Chains

Definition. A discrete-time stochastic process that has the Markov property is called a discrete-time Markov chain (DTMC). Note that this definition implies several other results.
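Because a DTMC is fully specified by its transition probabilities, simulating one reduces to repeatedly sampling the next state from the row of the transition matrix for the current state. A minimal sketch (the 3-state matrix below is an invented example, not from the lecture):

```python
import random

def simulate_dtmc(P, x0, n_steps, seed=0):
    """Simulate n_steps of a DTMC with row-stochastic transition matrix P
    (states 0 .. len(P)-1), starting from state x0."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n_steps):
        row = P[path[-1]]
        u = rng.random()
        # Inverse-transform sampling: walk the row until u falls in a cell.
        for j, p in enumerate(row):
            if u < p:
                nxt = j
                break
            u -= p
        else:
            nxt = len(row) - 1  # guard against floating-point round-off
        path.append(nxt)
    return path

# Invented 3-state example (rows sum to 1).
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]
print(simulate_dtmc(P, 0, 10))
```

Note that the sampler only ever looks at the current state when drawing the next one, so the simulated process satisfies the Markov property by construction.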

This note was uploaded on 02/12/2012 for the course ORIE 3510 taught by Professor Resnik during the Spring '09 term at Cornell University (Engineering School).

