5. Continuous-time Markov Chains

• Many processes one may wish to model occur in continuous time (e.g. disease transmission events, cell phone calls, mechanical component failure times, ...). A discrete-time approximation may or may not be adequate.

• {X(t), t ≥ 0} is a continuous-time Markov chain if it is a stochastic process taking values in a finite or countable set, say 0, 1, 2, ..., with the Markov property that

  P[X(t+s) = j | X(s) = i, X(u) = x(u) for 0 ≤ u < s] = P[X(t+s) = j | X(s) = i].

• Here we consider homogeneous chains, meaning

  P[X(t+s) = j | X(s) = i] = P[X(t) = j | X(0) = i].

• Write {X_n, n ≥ 0} for the sequence of states that {X(t)} arrives in, let S_n be the corresponding arrival times, and set the interarrival times X^A_n = S_n − S_{n−1}.

• The Markov property for {X(t)} implies the (discrete-time) Markov property for {X_n}; thus {X_n} is an embedded Markov chain, with transition matrix P = [P_ij].

• Similarly, the interarrival times {X^A_n} must be conditionally independent given {X_n}. Why?

• Show that X^A_n has a memoryless property conditional on X_{n−1},

  P[X^A_n > t + s | X^A_n > s, X_{n−1} = x] = P[X^A_n > t | X_{n−1} = x],

  i.e., X^A_n is conditionally exponentially distributed given X_{n−1}.

• We conclude that a continuous-time Markov chain is a special case of a semi-Markov process:

Construction 1. {X(t), t ≥ 0} is a continuous-time homogeneous Markov chain if it can be constructed from an embedded chain {X_n} with transition matrix [P_ij], with the duration of a visit to state i having an Exponential(ν_i) distribution.

• We assume 0 ≤ ν_i < ∞ in order to rule out trivial situations with instantaneous visits.

• An alternative to Construction 1 is as follows:

Construction 2. When X(t) arrives in state i, generate independent random variables Y_j ∼ Exponential(q_ij), where q_ij = ν_i P_ij for j ≠ i. Choose the next state to be k = arg min_j Y_j, and the time until the transition (i.e. the visit time in i) to be min_j Y_j.
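The two constructions can be sketched in simulation. The sketch below uses an illustrative 3-state chain (the rates ν_i and embedded matrix P are made up for this example, not taken from the notes): Construction 1 draws the visit time from Exponential(ν_i) and the next state from row i of P, while Construction 2 races independent exponentials Y_j ∼ Exponential(ν_i P_ij) and takes the winner. Both should give the same holding-time and next-state distributions.

```python
import random
import math

# Hypothetical 3-state chain (illustrative values): holding rates nu[i]
# and embedded transition matrix P, with P[i][i] = 0 and rows summing to 1.
nu = [1.0, 2.0, 0.5]
P = [[0.0, 0.7, 0.3],
     [0.5, 0.0, 0.5],
     [0.4, 0.6, 0.0]]

def step_construction1(i, rng):
    """One transition via Construction 1: draw the visit time from
    Exponential(nu_i), then draw the next state from row i of P."""
    hold = rng.expovariate(nu[i])
    u, acc = rng.random(), 0.0
    for j, p in enumerate(P[i]):
        acc += p
        if u < acc:
            return j, hold
    return len(P[i]) - 1, hold  # guard against floating-point round-off

def step_construction2(i, rng):
    """One transition via Construction 2: race independent
    Y_j ~ Exponential(q_ij), q_ij = nu_i * P_ij for j != i.
    The winner k = argmin_j Y_j is the next state; min_j Y_j is
    the visit time in i."""
    best_j, best_y = None, math.inf
    for j, p in enumerate(P[i]):
        if j == i or p == 0.0:
            continue
        y = rng.expovariate(nu[i] * p)
        if y < best_y:
            best_j, best_y = j, y
    return best_j, best_y

if __name__ == "__main__":
    # Compare the two constructions empirically from state 0: the mean
    # visit time should approach 1/nu[0] and the next-state frequencies
    # should approach row 0 of P for both.
    rng = random.Random(0)
    n = 50_000
    for step in (step_construction1, step_construction2):
        holds, to1 = 0.0, 0
        for _ in range(n):
            j, h = step(0, rng)
            holds += h
            to1 += (j == 1)
        print(step.__name__, round(holds / n, 3), round(to1 / n, 3))
```

For state 0 the racing exponentials have rates 0.7 and 0.3, so min_j Y_j ∼ Exponential(0.7 + 0.3) = Exponential(ν_0), and P(winner = 1) = 0.7/1.0 = P_01, which is exactly why Construction 2 reproduces Construction 1.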