Let Yn be a discrete-time Markov chain with transition probability u(i, j). Then Xt = Y_{N(t)}
is a continuous-time Markov chain. In words, Xt takes one jump according to
u(i, j) at each arrival of N(t).
Why is this true? Intuitively, it follows from the lack-of-memory property of
the exponential distribution. If Xs = i then, independent of what has happened
in the past, the time to the next jump is exponentially distributed with
rate λ, and at that jump the chain goes to state j with probability u(i, j).
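This construction can be sketched in a few lines of Python. The two-state matrix u, the rate lam, and the function name sample_X below are illustrative choices, not from the text: at each arrival of the rate-λ Poisson clock, the chain takes one step according to u.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-state jump chain and Poisson rate (not from the text).
u = np.array([[0.5, 0.5],
              [0.2, 0.8]])
lam = 1.0

def sample_X(t, i0):
    """Sample X_t = Y_{N(t)}: starting from i0, take one step according to u
    at each arrival of a rate-lam Poisson process, stopping at time t."""
    state, clock = i0, 0.0
    while True:
        clock += rng.exponential(1.0 / lam)   # Exp(lam) inter-arrival time
        if clock > t:                         # no more arrivals before time t
            return state
        state = rng.choice(len(u), p=u[state])  # one step of the Y chain
```

Averaging many samples of the indicator 1{sample_X(t, i) == j} gives a Monte Carlo estimate of pt(i, j).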
Discrete-time Markov chains were described by giving their transition probability p(i, j), the probability of jumping from i to j in one step. In continuous time there is no first time t > 0, so we introduce for each t > 0 a transition probability
pt(i, j) = P(Xt = j | X0 = i).

To compute this for Example 4.1, we note that N(t) has a Poisson number of jumps with mean λt, so

    p_t(i,j) = \sum_{n=0}^{\infty} e^{-\lambda t} \frac{(\lambda t)^n}{n!} \, u^n(i,j),

where u^n(i, j) is the nth power of the transition probability u(i, j).
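The series above can be evaluated numerically by truncating at a large n. A minimal sketch, reusing the illustrative two-state matrix u from before; the function name transition_matrix and the truncation level nmax are my choices, not from the text:

```python
import numpy as np

def transition_matrix(u, lam, t, nmax=50):
    """Approximate p_t = sum_{n=0}^{nmax} e^{-lam*t} (lam*t)^n / n! * u^n."""
    p = np.zeros_like(u, dtype=float)
    weight = np.exp(-lam * t)   # Poisson(lam*t) probability of n = 0 jumps
    un = np.eye(len(u))         # u^0 = identity
    for n in range(nmax + 1):
        p += weight * un
        weight *= lam * t / (n + 1)  # advance Poisson weight to n + 1
        un = un @ u                  # advance matrix power to u^(n+1)
    return p
```

Each row of the result sums to 1 up to truncation error, and the matrices satisfy the semigroup property p_{s+t} = p_s p_t, consistent with the identity p_t = exp(λt(u − I)).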
This document was uploaded on 03/06/2014 for the course MATH 4740 at Cornell University (Engineering School).