Markov Chains
OEM 2009, ESI 6321 Applied Probability Methods in Engineering

2 Stochastic processes
The last module of this course deals with describing and analyzing systems under uncertainty. In particular, we will study how a random variable changes over time. Examples:
- Price of a stock or portfolio of stocks
- Inventory level of a good in a warehouse
- Career path (workforce planning)
- Number of customers present in a store or bank
- etc.

3 Stochastic processes
We will focus on systems that we observe at discrete points in time; e.g., we evaluate the value of a stock portfolio at the end of each trading day.
- Denote the time points by t = 0, 1, 2, ...
- Denote the value (at time t) of the characteristic of the system that we are interested in by X_t; X_t is a random variable.
- The sequence X_0, X_1, X_2, ... is called a stochastic process.

4 Example 1
We are interested in tracking the stock market on a daily basis.
- Let X_t denote the value of the Dow Jones index at the end of trading day t.
- We are currently at the end of trading day 0, and observe X_0 = x.
- Can we model/study the relationship between the random variables X_t (t = 0, 1, 2, ...)?

5 Example 2
You are visiting Las Vegas and have a gambling budget of $x. You participate in a game in which you repeatedly bet $1:
- if you win (which happens with probability p), you receive $2;
- if you lose (which happens with probability 1 - p), you receive nothing;
- you stop playing as soon as you are broke or have doubled your initial budget.
Denote your total wealth at time t by X_t.

6 States and state space
The set of values that the random variable X_t can take is called the state space of the stochastic process.
- Example 1: [0, ∞)
- Example 2: {0, 1, 2, ..., 2x}
We will restrict ourselves to situations in which the state space consists of a finite number of elements only; often S = {1, 2, ..., s}. If X_t = i, we say that the stochastic process is in state i at time t.
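The gambling process of Example 2 can be simulated directly. The sketch below is illustrative only: the function name `gamble` and the use of Python's `random` module are my choices, not part of the course material. Each round adds $1 to your wealth with probability p (you bet $1 and receive $2) and subtracts $1 otherwise, stopping at wealth 0 or 2x.

```python
import random

def gamble(x, p, seed=None):
    """Simulate the betting game of Example 2.

    Start with wealth x; each round, wealth goes up by $1 with
    probability p and down by $1 otherwise. Stop when broke
    (wealth 0) or when the budget is doubled (wealth 2x).
    Returns the observed sequence X_0, X_1, X_2, ...
    """
    rng = random.Random(seed)
    wealth = x
    path = [wealth]
    while 0 < wealth < 2 * x:
        wealth += 1 if rng.random() < p else -1
        path.append(wealth)
    return path

path = gamble(x=5, p=0.5, seed=42)
print(path)  # ends at 0 (broke) or 10 (doubled)
```

Note that every value in the path lies in the state space {0, 1, 2, ..., 2x}, matching the discussion of states above.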
7 Dynamics of the stochastic process
Our goal is to describe and study the behavior of the sequence of random variables X_0, X_1, X_2, ... More specifically, suppose we are currently at time t. The stochastic process is characterized by how the next observed value, X_{t+1}, depends on the past observations.

8 Markov Chains
An important type of stochastic process is the Markov chain. In a Markov chain, the future value of the random variable depends only on the current value, not on past values. In other words:
- all past observations are summarized by the current value of the stochastic process;
- if we know the current state of the stochastic process, it is irrelevant to know how that state was reached.

9 Markov Chains
Example 1: To predict future values of the DJIA, is it sufficient to know the current value of the DJIA, or do we need to know some or all past observations?...
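The Markov property described above can be made concrete in code. In the sketch below, the 3-state transition matrix P and all its numbers are hypothetical, chosen only for illustration; the key point is that `next_state` looks at nothing but the current state i when sampling X_{t+1}.

```python
import random

# Hypothetical transition matrix for a 3-state Markov chain:
# P[i][j] is the probability of moving from state i to state j
# in one step. Each row must sum to 1. Values are illustrative.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
]

def next_state(i, rng):
    """Sample X_{t+1} given X_t = i. Only the current state i is
    used -- how state i was reached is irrelevant (Markov property)."""
    u = rng.random()
    cum = 0.0
    for j, pij in enumerate(P[i]):
        cum += pij
        if u < cum:
            return j
    return len(P) - 1  # guard against floating-point round-off

def simulate(start, steps, seed=None):
    """Generate the path X_0, X_1, ..., X_steps of the chain."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1], rng))
    return path

chain = simulate(start=0, steps=10, seed=1)
print(chain)
```

The gambling game of Example 2 fits this template as well: its transition probabilities from an interior state i are p to state i + 1 and 1 - p to state i - 1, with the boundary states 0 and 2x absorbing.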
This note was uploaded on 05/12/2010 for the course ESI 6321 taught by Professor Josephgeunes during the Spring '07 term at University of Florida.