Introductory Engineering Stochastic Processes, ORIE 3510
Instructor: Mark E. Lewis, Associate Professor
School of Operations Research and Information Engineering, Cornell University

Outline: Stochastic Processes; The Markov Property; Markov Chains; Examples

Disclaimer: these notes are meant only as a lecture supplement, not a substitute!

Stochastic Processes

A stochastic process is a sequence of random variables, usually indexed by time. For example:
- in continuous time: {X_t, t ≥ 0}
- in discrete time: {X_n, n ≥ 0}

The value of each random variable usually represents the state of some process. The set of all possible states (for all time) is called the state space. If we follow one realization of a stochastic process for all time, it is called a sample path of the process. Examples:
- the Dow Jones Industrial Average (DJIA) graphed over time
- daily inventory levels

History and Future

Before viewing the process, the path is unknown. After viewing some portion of the path, {X_1, X_2, ..., X_n} or {X_s, s < t} is called the history up until time n or t, respectively. Similarly, (X_{n+1}, X_{n+2}, ...) is called the future of the process after time n, and {X_s, s > t} the future after time t.

The Markov Property

It is often the case that, if we know the current state of the system, we do not need to know anything about the past in order to make predictions about the future.

Definition: If {X_n, n ≥ 0} is a stochastic process viewed at times n = 0, 1, 2, ...,
then the Markov property states that

  P(X_{n+1} = x_{n+1} | X_0 = x_0, X_1 = x_1, ..., X_n = x_n) = P(X_{n+1} = x_{n+1} | X_n = x_n),

i.e., given the present state of the process, the future is independent of the past.

Implications of the Markov Property

Note that the Markov property implies several other results:

  E(X_{n+1} | X_n, X_{n-1}, ..., X_0) = E(X_{n+1} | X_n)

  P(X_{n+k} = x_{n+k} | X_0 = x_0, X_1 = x_1, ..., X_n = x_n) = P(X_{n+k} = x_{n+k} | X_n = x_n)

Transition Probabilities and Time Homogeneity

Definition: P(X_{n+1} = x_{n+1} | X_n = x_n) is called the one-step transition probability.
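To make these definitions concrete, here is a minimal simulation sketch (not from the lecture; the two-state chain and its transition probabilities are illustrative assumptions). It generates a sample path of a time-homogeneous Markov chain, in which the next state depends only on the current state, and then estimates one transition probability empirically from the path.

```python
import random

# A hypothetical two-state time-homogeneous Markov chain with states 0 and 1.
# P[i][j] = P(X_{n+1} = j | X_n = i), the one-step transition probabilities.
P = [[0.9, 0.1],
     [0.4, 0.6]]

def simulate(n_steps, x0=0, seed=0):
    """Return one sample path [X_0, X_1, ..., X_n] of the chain.

    The Markov property is built in: each new state is drawn using
    only the current state x, never the earlier history.
    """
    rng = random.Random(seed)
    x, path = x0, [x0]
    for _ in range(n_steps):
        x = 0 if rng.random() < P[x][0] else 1
        path.append(x)
    return path

# Empirically estimate P(X_{n+1} = 1 | X_n = 0) from one long path.
path = simulate(100_000)
from_0 = [path[k + 1] for k in range(len(path) - 1) if path[k] == 0]
print(sum(from_0) / len(from_0))  # approximately P[0][1] = 0.1
```

Because the chain is time homogeneous, the same matrix P governs every step, so 0-to-1 transitions collected anywhere along the path all estimate the single number P[0][1].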
This note was uploaded on 10/07/2010 for the course ORIE 3510 taught by Professor Resnik during the Spring '09 term at Cornell.