Chapter 4
Markov Chains
4.1 Definitions and Examples
The importance of Markov chains comes from two facts: (i) there are a large
number of physical, biological, economic, and social phenomena that can be
described in this way, and (ii) there is a well-developed theory that allows us to
do computations. We begin with a famous example, then describe the property
that is the defining feature of Markov chains.
Example 4.1 (Gambler's ruin). Consider a gambling game in which on any turn you win $1 with probability p = 0.4 or lose $1 with probability 1 − p = 0.6. Suppose further that you adopt the rule that you quit playing if your fortune reaches $N. Of course, if your fortune reaches $0 the casino makes you stop.

Let X_n be the amount of money you have after n plays. I claim that your fortune, X_n, has the "Markov property." In words, this means that given the current state, any other information about the past is irrelevant for predicting the next state X_{n+1}. To check this, we note that if you are still playing at time n, i.e., your fortune X_n = i with 0 < i < N, then for any possible history of your wealth i_{n−1}, i_{n−2}, . . . , i_1, i_0,

P(X_{n+1} = i + 1 | X_n = i, X_{n−1} = i_{n−1}, . . . , X_0 = i_0) = 0.4

since to increase your wealth by one unit you have to win your next bet, and the outcome of the previous bets has no useful information for predicting the next outcome.

Turning now to the formal definition, we say that X_n is a discrete time Markov chain with transition matrix p(i, j) if for any j, i, i_{n−1}, . . . , i_0,

P(X_{n+1} = j | X_n = i, X_{n−1} = i_{n−1}, . . . , X_0 = i_0) = p(i, j)
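To make the gambler's ruin example concrete, here is a minimal simulation sketch (not from the text; the function name `gamblers_ruin` and its parameters are illustrative). It runs the chain from a starting fortune until it is absorbed at $0 or $N, and then empirically checks the claim above: across every step of every trajectory, regardless of the history, the fraction of up-steps should be close to p = 0.4.

```python
import random

def gamblers_ruin(start, N, p=0.4, rng=None):
    """Simulate one run of the gambler's ruin chain until absorption.

    Each turn the fortune goes up $1 with probability p, down $1 with
    probability 1 - p; the run stops when the fortune hits 0 or N.
    Returns the trajectory [X_0, X_1, ..., X_T].
    """
    rng = rng or random.Random()
    x = start
    path = [x]
    while 0 < x < N:
        x += 1 if rng.random() < p else -1
        path.append(x)
    return path

# Empirically estimate P(X_{n+1} = i + 1 | X_n = i) over many runs.
rng = random.Random(0)
ups = steps = 0
for _ in range(10_000):
    path = gamblers_ruin(start=5, N=10, p=0.4, rng=rng)
    for a, b in zip(path, path[1:]):
        steps += 1
        ups += (b == a + 1)

up_fraction = ups / steps  # should be near 0.4, whatever the histories were
```

Note that the simulation never consults the past of the path when deciding the next step; that is exactly the Markov property the text is describing, and it is why the pooled up-step frequency matches p.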