4106-08-Notes-MCI

Copyright © 2006 by Karl Sigman

1 Discrete-time Markov chains

1.1 Stochastic processes in discrete time

A stochastic process in discrete time $n \in \mathbb{N} = \{0, 1, 2, \ldots\}$ is a sequence of random variables (rvs) $X_0, X_1, X_2, \ldots$ denoted by $X = \{X_n : n \ge 0\}$ (or just $X = \{X_n\}$). We refer to the value $X_n$ as the state of the process at time $n$, with $X_0$ denoting the initial state. If the random variables take values in a discrete space such as the integers $\mathbb{Z} = \{\ldots, -2, -1, 0, 1, 2, \ldots\}$ (or some subset of them), then the stochastic process is said to be discrete-valued; we then denote the states by $i, j$ and so on. In general, however, the collection of possible values that the $X_n$ can take on is called the state space; it is denoted by $\mathcal{S}$ and could be, for example, $d$-dimensional Euclidean space $\mathbb{R}^d$, $d \ge 1$, or a subset of it.

Stochastic processes are meant to model the evolution over time of real phenomena for which randomness is inherent. For example, $X_n$ could denote the price of a stock $n$ days from now, the population size of a given species after $n$ years, the amount of bandwidth in use in a telecommunications network after $n$ hours of operation, or the amount of money that an insurance risk company has right after it pays out its $n$th claim. The insurance risk example illustrates how "time" $n$ need not really be time, but instead can be a sequential indexing of some kind of events. Other such examples: $X_n$ denotes the amount of water in a reservoir after the $n$th rain storm of the year, $X_n$ denotes the amount of time that the $n$th phone caller to an airline has to wait until speaking to a salesperson, or $X_n$ denotes the outcome (heads or tails) of the $n$th flip of a coin.

The main challenge in the stochastic modeling of something is in choosing a model that has, on the one hand, enough complexity to capture the complexity of the phenomena in question, but, on the other hand, enough structure and simplicity to allow one to compute things of interest. In the context of the examples given above, we may be interested in computing $P(X_{30} > 50)$ for a stock that we bought for $X_0 = \$35$ per share, or computing the probability that the insurance risk company eventually gets ruined (runs out of money), $P(X_n < 0 \text{ for some } n > 0)$, or computing the long-run average waiting time of a caller to the airline,
$$\lim_{N \to \infty} \frac{1}{N} \sum_{n=1}^{N} X_n.$$

As a very simple example, for the sequential tossing of a "fair" coin, we would naturally assume that each $X_n$ is a Bernoulli(0.5) rv, $P(X_n = 0) = P(X_n = 1) = 0.5$, and that the sequence of rvs is independent. This then yields an example of an independent and identically distributed (iid) sequence of rvs. Such sequences are easy to deal with, for they are defined by a single distribution (in this case Bernoulli) and lend themselves directly to powerful theorems in probability such as the strong law of large numbers and the central limit theorem. For the other examples given above, however, an iid sequence would not capture enough
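To make the coin-tossing example concrete, here is a minimal simulation sketch (in Python, not part of the original notes): it draws $N$ iid Bernoulli(0.5) rvs and reports the sample average $\frac{1}{N}\sum_{n=1}^{N} X_n$, which by the strong law of large numbers converges to $E[X_1] = 0.5$ as $N \to \infty$.

```python
import random

def coin_flip_average(N, seed=0):
    """Simulate N iid fair-coin flips, X_n in {0, 1} with P(X_n = 1) = 0.5,
    and return the sample average (1/N) * sum_{n=1}^{N} X_n."""
    rng = random.Random(seed)
    return sum(rng.randint(0, 1) for _ in range(N)) / N

# By the strong law of large numbers, the averages settle near 0.5 as N grows.
for N in (10, 1_000, 100_000):
    print(N, coin_flip_average(N))
```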
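The ruin probability $P(X_n < 0 \text{ for some } n > 0)$ mentioned above can likewise be approximated by Monte Carlo once a model for the capital process is fixed. The notes have not specified one at this point, so the sketch below assumes a purely illustrative toy model: the company starts with capital $X_0$, collects a fixed premium each period, pays an iid exponentially distributed claim, and the infinite-horizon event is truncated at a finite number of periods (so the estimate is a slight underestimate). All parameter values are hypothetical.

```python
import random

def estimate_ruin_probability(x0=5.0, premium=1.0, mean_claim=0.9,
                              horizon=1_000, trials=20_000, seed=0):
    """Monte Carlo estimate of P(X_n < 0 for some n > 0) under an assumed
    toy model: X_n = X_{n-1} + premium - C_n with iid exponential claims C_n
    of mean `mean_claim`. Truncating at `horizon` periods makes this a
    slight underestimate of the true infinite-horizon ruin probability."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        x = x0
        for _ in range(horizon):
            x += premium - rng.expovariate(1.0 / mean_claim)
            if x < 0:
                ruined += 1
                break
    return ruined / trials

print(estimate_ruin_probability())
```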

