Statistics 150: Spring 2007
March 6, 2007

1 Continuous-Time Markov Chains

Consider a continuous-time stochastic process $\{X(t),\ t \ge 0\}$ taking values in a set of nonnegative integers (we could take any finite or countable state space, but we will work with the integers for concreteness). We say that the process $\{X(t),\ t \ge 0\}$ is a continuous-time Markov chain if for all $s, t \ge 0$ and nonnegative integers $i, j, x(u)$, $0 \le u < s$,

$$P\{X(t+s)=j \mid X(s)=i,\ X(u)=x(u),\ 0 \le u < s\} = P\{X(t+s)=j \mid X(s)=i\} = P\{X(t)=j \mid X(0)=i\}.$$

If we let $\tau_i$ denote the amount of time that the process stays in state $i$ before making a transition into a different state, then

$$P\{\tau_i > s+t \mid \tau_i > s\} = P\{\tau_i > t\} \quad \text{for all } s, t \ge 0.$$

Hence the random variable $\tau_i$ is memoryless and must therefore be exponentially distributed.

The above gives us a way of constructing a continuous-time Markov chain. Namely, it is a stochastic process with the property that each time it enters state $i$: (i) the amount of time it spends in that state before making a transition into a different state is exponentially distributed with rate, say, $v_i$; and (ii) when the process leaves state $i$, it next enters state $j$ with some probability, call it $p_{ij}$, where $\sum_{j \ne i} p_{ij} = 1$.

A continuous-time Markov chain is said to be regular if, with probability 1, the number of transitions in any finite length of time is finite. We shall assume from now on that all Markov chains considered are regular.

Let $q_{ij}$ be defined by

$$q_{ij} = v_i \, p_{ij}, \quad \text{all } i \ne j.$$

It follows that $q_{ij}$ is the rate, when in state $i$, at which the process makes a transition into state $j$. Let $P_{ij}(t) = P\{X(t+s)=j \mid X(s)=i\}$.

2 Birth and Death Processes

A continuous-time Markov chain with states $0, 1, \ldots$ for which $q_{ij} = 0$ whenever $|i-j| > 1$ is called a birth and death process. Let $\lambda_i$ and $\mu_i$ be given by

$$\lambda_i = q_{i,i+1}, \qquad \mu_i = q_{i,i-1}.$$
The values $\{\lambda_i,\ i \ge 0\}$ and $\{\mu_i,\ i \ge 1\}$ are called respectively the birth rates and the death rates. We see that

$$v_i = \lambda_i + \mu_i, \qquad p_{i,i+1} = \frac{\lambda_i}{\lambda_i + \mu_i} = 1 - p_{i,i-1}.$$

Example 2.1. (i) The M/M/s Queue. Suppose that customers arrive at an $s$-server service station in accordance with a Poisson process having rate $\lambda$. That is, the times between successive arrivals are independent exponential random variables having mean $1/\lambda$. Each customer, upon arrival, goes directly into service if any of the servers are free; if not, the customer joins the queue (that is, he waits in line). When a server finishes serving a customer, that customer leaves the system, and the next customer in line, if there are any waiting, enters service. ...
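The two-step construction from Section 1 (hold an exponential time with rate $v_i$, then jump to $j$ with probability $p_{ij}$) translates directly into a simulation. Below is a minimal sketch; the function name `simulate_ctmc` and the two-state example at the end are illustrative choices, not part of the notes.

```python
import random

def simulate_ctmc(rates, jump_probs, start, t_end, rng):
    """Simulate a continuous-time Markov chain on states 0..n-1.

    rates[i]      -- exponential holding rate v_i for state i
    jump_probs[i] -- dict {j: p_ij} of jump probabilities out of state i

    Returns the list of (time, state) pairs at each jump, following the
    standard construction: stay in state i for an Exp(v_i) time, then
    move to state j with probability p_ij.
    """
    t, state = 0.0, start
    path = [(t, state)]
    while True:
        t += rng.expovariate(rates[state])   # exponential holding time in `state`
        if t > t_end:
            break
        # sample the next state j != state with probability p_ij
        u, acc = rng.random(), 0.0
        for j, p in jump_probs[state].items():
            acc += p
            if u <= acc:
                state = j
                break
        path.append((t, state))
    return path

# Illustrative two-state chain: v_0 = 1, v_1 = 2, and each state always
# jumps to the other (p_01 = p_10 = 1), so the path alternates 0, 1, 0, ...
rng = random.Random(0)
path = simulate_ctmc([1.0, 2.0], [{1: 1.0}, {0: 1.0}],
                     start=0, t_end=10.0, rng=rng)
```

Because the holding times are sampled independently given the current state, this reproduces exactly the process characterized by the memoryless property above.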
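The M/M/s queue fits the birth and death framework: arrivals give a constant birth rate $\lambda_i = \lambda$, while departures give $\mu_i = \min(i, s)\,\mu$, since at most $s$ servers can be busy. A small sketch, with the helper names and the numerical values being my own illustrative choices (per-server service rate $\mu$ is implied by, but not yet named in, the truncated example):

```python
def mms_rates(i, lam, mu, s):
    """Birth and death rates (lambda_i, mu_i) for an M/M/s queue
    with i customers present: arrivals occur at rate lam regardless
    of i, and only the min(i, s) busy servers can complete service."""
    return lam, min(i, s) * mu

def jump_up_prob(i, lam, mu, s):
    """p_{i,i+1} = lambda_i / (lambda_i + mu_i), as derived above."""
    lam_i, mu_i = mms_rates(i, lam, mu, s)
    return lam_i / (lam_i + mu_i)

# With lam = 2, mu = 1, s = 3: in state 2 only two servers are busy,
# so mu_2 = 2 and the embedded chain moves up with probability
# p_{2,3} = 2 / (2 + 2) = 0.5.
```

Note that for $i \ge s$ the death rate saturates at $s\mu$, which is what distinguishes the M/M/s queue from the single-server (M/M/1) and infinite-server (M/M/$\infty$) cases.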
This note was uploaded on 05/07/2008 for the course STAT 150 taught by Professor Evans during the Spring '08 term at University of California, Berkeley.
