Then the process will want to leave i immediately, so we will always suppose that each
state i has λ_i < ∞. If λ_i = 0, then X_t will never leave i. So suppose λ_i > 0 and set
r(i, j) = q(i, j) / λ_i
Here r, short for “routing matrix,” is the probability the chain goes to j when
it leaves i.
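As a concrete illustration (not from the text), the routing matrix can be computed from a table of jump rates. The rates q below are made up for the example; only the formula r(i, j) = q(i, j)/λ_i comes from the passage:

```python
# Hypothetical jump rates q(i, j) for a 3-state chain (diagonal entries unused).
q = {
    (0, 1): 2.0, (0, 2): 1.0,
    (1, 0): 0.5, (1, 2): 0.5,
    (2, 0): 3.0, (2, 1): 1.0,
}
states = [0, 1, 2]

# lam[i] = λ_i = sum_j q(i, j): the total rate of leaving state i.
lam = {i: sum(q.get((i, j), 0.0) for j in states if j != i) for i in states}

# r(i, j) = q(i, j) / λ_i: probability the chain goes to j when it leaves i.
r = {(i, j): q.get((i, j), 0.0) / lam[i]
     for i in states for j in states if j != i and lam[i] > 0}

print(lam[0], r[(0, 1)])  # λ_0 = 3.0, r(0, 1) = 2/3
```

Note that each row of r sums to 1, as it must for a transition probability.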
Informal construction. If X_t is in a state i with λ_i = 0, then X_t stays there
forever and the construction is done. If λ_i > 0, X_t stays at i for an exponentially
distributed amount of time with rate λ_i, then goes to state j with probability
r(i, j).

Formal construction. Suppose, for simplicity, that λ_i > 0 for all i. Let Y_n be
a Markov chain with transition probability r(i, j). The discrete-time chain Y_n
gives the road map that the continuous-time process will follow. To determine
how long the process should stay in each state, let τ_0, τ_1, τ_2, ... be independent
exponentials with rate 1.
At time 0 the process is in state Y_0 and should stay there for an amount of
time that is exponential with rate λ(Y_0), so we let the time the p...
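The two-step recipe of the informal construction (exponential holding time with rate λ_i, then a jump chosen by the routing matrix) can be sketched as a simulation. This is a minimal sketch assuming hypothetical jump rates q; the function name and example rates are illustrative, not from the text:

```python
import random

def simulate_ctmc(q, states, x0, t_max, rng=random.Random(0)):
    """Simulate a continuous-time Markov chain with jump rates q(i, j).

    Follows the informal construction: wait an exponential time with
    rate λ_i = sum_j q(i, j), then jump to j with probability
    r(i, j) = q(i, j) / λ_i. Returns the (time, state) pairs visited
    before t_max.
    """
    path = [(0.0, x0)]
    t, i = 0.0, x0
    while True:
        lam_i = sum(q.get((i, j), 0.0) for j in states if j != i)
        if lam_i == 0:
            break                        # λ_i = 0: the chain never leaves i
        t += rng.expovariate(lam_i)      # exponential holding time, rate λ_i
        if t >= t_max:
            break
        # Choose the next state j with probability r(i, j) = q(i, j) / λ_i.
        u, acc = rng.random() * lam_i, 0.0
        for j in states:
            if j == i:
                continue
            acc += q.get((i, j), 0.0)
            if u <= acc:
                i = j
                break
        path.append((t, i))
    return path

# Example with made-up rates on states {0, 1, 2}.
q = {(0, 1): 2.0, (1, 0): 1.0, (1, 2): 1.0, (2, 1): 3.0}
path = simulate_ctmc(q, [0, 1, 2], x0=0, t_max=10.0)
```

Sampling the jump target by accumulating the rates is equivalent to first normalizing them into the routing matrix r; it just avoids dividing by λ_i.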
This document was uploaded on 03/06/2014 for the course MATH 4740 at Cornell (Spring '10).