# Stat150_Spring08_Markov_cts - Statistics 150, Spring 2007


Statistics 150: Spring 2007, March 6, 2007

## 1 Continuous-Time Markov Chains

Consider a continuous-time stochastic process $\{X(t), t \ge 0\}$ taking on values in the set of nonnegative integers (we could take any finite or countable state space, but we will work with the integers for concreteness). We say that the process $\{X(t), t \ge 0\}$ is a continuous-time Markov chain if for all $s, t \ge 0$ and nonnegative integers $i, j, x(u), 0 \le u < s$,

$$P\{X(t+s) = j \mid X(s) = i,\ X(u) = x(u),\ 0 \le u < s\} = P\{X(t+s) = j \mid X(s) = i\} = P\{X(t) = j \mid X(0) = i\}.$$

The first equality is the Markov property: given the present state, the future is independent of the past. The second says the transition probabilities are stationary (time-homogeneous).
If we let $\tau_i$ denote the amount of time that the process stays in state $i$ before making a transition into a different state, then

$$P\{\tau_i > s + t \mid \tau_i > s\} = P\{\tau_i > t\}, \quad \text{for all } s, t \ge 0.$$

Hence the random variable $\tau_i$ is memoryless and must therefore be exponentially distributed.
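The memoryless identity above is easy to check numerically. The following is a minimal Python sketch (the rate and the values of $s$ and $t$ are illustrative, not from the notes): it draws exponential holding times and compares the conditional tail probability with the unconditional one.

```python
import random

random.seed(0)

# Draw holding times tau ~ Exponential(rate) and check the memoryless
# property numerically: P(tau > s + t | tau > s) should equal P(tau > t).
rate, s, t, n = 2.0, 0.3, 0.5, 200_000   # illustrative values
samples = [random.expovariate(rate) for _ in range(n)]

exceed_s = [x for x in samples if x > s]
p_cond = sum(x > s + t for x in exceed_s) / len(exceed_s)
p_uncond = sum(x > t for x in samples) / n

print(f"P(tau > s+t | tau > s) = {p_cond:.3f}")
print(f"P(tau > t)             = {p_uncond:.3f}")  # both ≈ exp(-rate*t) ≈ 0.368
```

Both estimates agree with the theoretical tail $e^{-\text{rate}\cdot t}$, which is the content of the memoryless property.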

The above gives us a way of constructing a continuous-time Markov chain. Namely, it is a stochastic process having the properties that each time it enters state $i$: (i) the amount of time it spends in that state before making a transition into a different state is exponentially distributed with rate, say, $\nu_i$; and (ii) when the process leaves state $i$, it next enters state $j$ with some probability, call it $p_{ij}$, where $\sum_{j \ne i} p_{ij} = 1$.
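This two-step construction (hold an exponential time with rate $\nu_i$, then jump according to $p_{ij}$) translates directly into a simulation. A minimal sketch, using a hypothetical 3-state chain whose holding rates and jump matrix are made up for illustration:

```python
import random

random.seed(1)

# Hypothetical 3-state chain: holding rates nu[i] and jump matrix p[i][j]
# (p[i][i] = 0, each row sums to 1).  Values are illustrative.
nu = [1.0, 2.0, 0.5]
p = [[0.0, 0.7, 0.3],
     [0.5, 0.0, 0.5],
     [0.9, 0.1, 0.0]]

def simulate(x0, t_end):
    """Run the chain from state x0 until time t_end; return the jump path."""
    t, x, path = 0.0, x0, [(0.0, x0)]
    while True:
        t += random.expovariate(nu[x])                 # exponential holding time
        if t >= t_end:
            return path
        x = random.choices(range(3), weights=p[x])[0]  # jump to a new state
        path.append((t, x))

path = simulate(0, 10.0)
print(path[:5])
```

Each entry of `path` records a jump time and the state entered at that time; the state-dependent clock and jump distribution are exactly properties (i) and (ii) above.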
A continuous-time Markov chain is said to be regular if, with probability 1, the number of transitions in any finite length of time is finite. We shall assume from now on that all Markov chains considered are regular.

Let $q_{ij}$ be defined by

$$q_{ij} = \nu_i p_{ij}, \quad \text{for all } i \ne j.$$

It follows that $q_{ij}$ is the rate, when in state $i$, at which the process makes a transition into state $j$. Let

$$P_{ij}(t) = P\{X(t+s) = j \mid X(s) = i\}.$$
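The relation $q_{ij} = \nu_i p_{ij}$ can be sanity-checked on a small example; the 3-state holding rates and jump probabilities below are hypothetical.

```python
# Recover transition rates q_ij = nu_i * p_ij for a hypothetical
# 3-state chain (rates and jump probabilities are illustrative).
nu = [1.0, 2.0, 0.5]                      # holding-time rates nu_i
p = [[0.0, 0.7, 0.3],                     # jump matrix: p[i][i] = 0,
     [0.5, 0.0, 0.5],                     # each row sums to 1
     [0.9, 0.1, 0.0]]

q = [[nu[i] * p[i][j] for j in range(3)] for i in range(3)]
for i, row in enumerate(q):
    print(i, row)   # row i sums back to nu_i, since sum over j != i of p_ij = 1
```

Summing row $i$ of $q$ recovers $\nu_i$, reflecting that the total rate of leaving state $i$ splits among the destination states in proportion to $p_{ij}$.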
## 2 Birth and Death Processes

A continuous-time Markov chain with states $0, 1, \ldots$ for which $q_{ij} = 0$ whenever $|i - j| > 1$ is called a birth and death process. Let $\lambda_i$ and $\mu_i$ be given by

$$\lambda_i = q_{i,i+1}, \qquad \mu_i = q_{i,i-1}.$$

The values $\{\lambda_i, i \ge 0\}$ and $\{\mu_i, i \ge 1\}$ are called the birth rates and the death rates, respectively. We see that

$$\nu_i = \lambda_i + \mu_i, \qquad p_{i,i+1} = \frac{\lambda_i}{\lambda_i + \mu_i} = 1 - p_{i,i-1}.$$
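For a birth and death process the formulas above give $\nu_i$ and the jump probabilities directly from $\lambda_i$ and $\mu_i$. A small sketch with made-up rates (constant births, deaths proportional to the population):

```python
# Birth and death rates for a hypothetical process: constant births,
# deaths proportional to the population (values made up for illustration).
def birth(i):
    return 1.5                             # lambda_i

def death(i):
    return 0.5 * i                         # mu_i (zero in state 0)

def holding_rate(i):
    return birth(i) + death(i)             # nu_i = lambda_i + mu_i

def p_up(i):
    return birth(i) / holding_rate(i)      # p_{i,i+1}

for i in range(4):
    print(i, holding_rate(i), round(p_up(i), 3))   # p_{i,i-1} = 1 - p_up(i)
```

Note that in state 0 there is no death, so $p_{0,1} = 1$: the chain can only move up.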

**Example 2.1.** (i) *The M/M/s Queue.* Suppose that customers arrive at an $s$-server service station in accordance with a Poisson process having rate $\lambda$. That is, the times between successive arrivals are independent exponential random variables having mean $1/\lambda$. Each customer, upon arrival, goes directly into service if any of the servers are free; if not, the customer joins the queue (that is, he waits in line). When a server finishes serving a customer, the customer leaves the system, and the next customer in line, if there are any waiting, enters service.