Markov Chains
Sometimes we are interested in how a random variable changes over time. For example, we may want to know how the price of a share of stock or a firm's market share evolves. The study of how a random variable changes over time includes stochastic processes, which are explained in this chapter. In particular, we focus on a type of stochastic process known as a Markov chain. Markov chains have been applied in areas such as education, marketing, health services, finance, accounting, and production. We begin by defining the concept of a stochastic process. In the rest of the chapter, we will discuss the basic ideas needed for an understanding of Markov chains.
17.1 What Is a Stochastic Process?
Suppose we observe some characteristic of a system at discrete points in time (labeled 0, 1, 2, . . .). Let X_t be the value of the system characteristic at time t. In most situations, X_t is not known with certainty before time t and may be viewed as a random variable. A discrete-time stochastic process is simply a description of the relation between the random variables X_0, X_1, X_2, . . . .
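To make this definition concrete, here is a small simulation sketch (our own illustration, not from the text). We take X_t to be the number of heads seen in the first t flips of a fair coin, so X_0 is a known constant while each later X_t is random before time t.

```python
import random

def sample_process(n_steps, seed=0):
    """Generate one realization X_0, X_1, ..., X_n of a toy
    discrete-time stochastic process: X_t is the number of heads
    in the first t flips of a fair coin."""
    rng = random.Random(seed)
    x = 0
    xs = [x]  # X_0 = 0 is known with certainty at time 0
    for _ in range(n_steps):
        if rng.random() < 0.5:  # heads with probability 1/2
            x += 1
        xs.append(x)  # X_t is only revealed once time t arrives
    return xs
```

Running the sketch with different seeds produces different realizations of X_0, X_1, X_2, . . . , which is exactly what "stochastic process" means here: one random variable per observation time.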
Some examples of discrete-time stochastic processes follow.
EXAMPLE 1  The Gambler's Ruin
At time 0, I have $2. At times 1, 2, . . . , I play a game in which I bet $1. With probability p, I win the game, and with probability 1 - p, I lose the game. My goal is to increase my capital to $4, and as soon as I do, the game is over. The game is also over if my capital is reduced to $0. If we define X_t to be my capital position after the time t game (if any) is played, then X_0, X_1, . . . , X_t may be viewed as a discrete-time stochastic process. Note that X_0 = 2 is a known constant, but X_1 and later X_t's are random. For example, with probability p, X_1 = 3, and with probability 1 - p, X_1 = 1. Note that if X_t = 4, then X_{t+1} and all later X_t's will also equal 4. Similarly, if X_t = 0, then X_{t+1} and all later X_t's will also equal 0. For obvious reasons, this type of situation is called a gambler's ruin problem.
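This chain is easy to simulate. The sketch below (function and parameter names are our own) generates one realization X_0, X_1, . . . of the gambler's ruin process, stopping when the capital reaches $0 or the $4 goal, and then uses repeated runs to estimate the probability of winning.

```python
import random

def gamblers_ruin_path(p, capital=2, goal=4, seed=None):
    """One realization X_0, X_1, ... of the gambler's ruin chain:
    start with $2, bet $1 each period, win with probability p,
    and stop when the capital hits $0 or the $4 goal."""
    rng = random.Random(seed)
    x = capital
    path = [x]  # X_0 = 2 is a known constant
    while 0 < x < goal:  # states 0 and 4 are absorbing
        x += 1 if rng.random() < p else -1  # win or lose the $1 bet
        path.append(x)
    return path

def estimate_win_probability(p, trials=20000, seed=0):
    """Monte Carlo estimate of the probability the game ends at $4."""
    rng = random.Random(seed)
    wins = sum(
        gamblers_ruin_path(p, seed=rng.random())[-1] == 4
        for _ in range(trials)
    )
    return wins / trials
```

For a fair game (p = 0.5), symmetry suggests the estimate should come out near 0.5, since the $2 starting capital sits exactly halfway between the absorbing states $0 and $4.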
EXAMPLE 2  Choosing Balls from an Urn
An urn contains two unpainted balls at present. We choose a ball at random and flip a coin. If the chosen ball is unpainted and the coin comes up heads, we paint the chosen unpainted ball red; if the chosen ball is unpainted and the coin comes up tails, we paint the chosen unpainted ball black. If the ball has already been painted, then (whether heads or tails has been tossed) we change the color of the ball (from red to black or from black to red). To model this situation as a stochastic process, we define time t to be the time af-