Handbook 6.1: Introduction to Markov Chains
• Consider an experiment consisting of several stages, where at each stage there is a fixed number of possible outcomes.
• If the probabilities of the outcomes at the next stage depend only on the current outcome, then the experiment is called a Markov chain. With a Markov chain, we call the outcomes states.
Example:
Suppose it has been found that if it is rainy today, then there is a 0.8 probability of rain tomorrow. Also, suppose it has been found that if it does not rain today, then the probability of no rain tomorrow is 0.6.
This description of the weather is a Markov chain: as described, the next day’s weather depends only on the current day’s weather. Also, this Markov chain has two states: “Rain” and “No Rain”.
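To make the Markov property concrete, here is a minimal Python sketch (illustrative only; the names and the simulation are not part of the handbook). It samples tomorrow’s weather using nothing but today’s state. The probabilities not stated above are the complements of the stated ones, since “Rain” and “No Rain” are the only two outcomes.

    import random

    # Transition probabilities from the example; each inner dict gives the
    # chances for tomorrow's weather given today's weather.
    transition = {
        "Rain":    {"Rain": 0.8, "No Rain": 0.2},   # 0.2 = 1 - 0.8
        "No Rain": {"Rain": 0.4, "No Rain": 0.6},   # 0.4 = 1 - 0.6
    }

    def next_state(today):
        """Sample tomorrow's weather; it depends only on today's state."""
        outcomes = list(transition[today])
        weights = [transition[today][s] for s in outcomes]
        return random.choices(outcomes, weights=weights)[0]

    # Simulate one week of weather starting from a rainy day.
    state = "Rain"
    for day in range(7):
        print(day, state)
        state = next_state(state)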
• We will use matrices to help us answer questions involving Markov chains.
• The probabilities described in the example above are called transition probabilities, because they are the probabilities of changing from one state to another.
• We organize these transition probabilities in a transition matrix, which we call T. The rows of T represent the current state; the columns of T represent the next state.
• The order of the states for the rows is the same as the order of the states for the columns.
• We then fill in the entries in a very natural way; for example, the entry in the 2nd row, 1st column would be the probability of going from State 2 to State 1.
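As a quick illustration of this indexing convention, here is a short Python sketch (the matrix below uses made-up numbers, not the weather example):

    # T[i][j] = probability of moving from state i to state j.
    # Rows are "current" states; columns are "next" states.
    T = [
        [0.9, 0.1],   # from State 1: stay with probability 0.9, move to State 2 with 0.1
        [0.3, 0.7],   # from State 2: move to State 1 with 0.3, stay with 0.7
    ]

    # The entry in the 2nd row, 1st column (zero-based indices 1 and 0)
    # is the probability of going from State 2 to State 1.
    print(T[1][0])   # prints 0.3

    # From any state the chain must go somewhere, so each row sums to 1.
    assert all(abs(sum(row) - 1.0) < 1e-9 for row in T)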
Example:
Construct the transition matrix T for the two-state Markov chain described earlier.
Since there are two states, T will have two rows (one for each state) and two columns (one for each state). Let’s take the order of the states as “Rain”, “No Rain”.
$$T = \bordermatrix{ & R & NR \cr R & & \cr NR & & \cr }$$
Now, we need to fill in the probabilities. The entry in the first row, first column is the probability of going from a rainy day to another rainy day. From the example, that probability is 0.8. Since tomorrow must be either rainy or not rainy, each row’s entries sum to 1, so the entry in the first row, second column is 1 - 0.8 = 0.2. Likewise, the probability of going from a non-rainy day to another non-rainy day is 0.6, so the second row’s entries are 0.4 and 0.6:

$$T = \bordermatrix{ & R & NR \cr R & 0.8 & 0.2 \cr NR & 0.4 & 0.6 \cr }$$
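The finished matrix is easy to check in code. Here is a minimal Python sketch (illustrative, not from the handbook) that encodes T with the rows ordered Rain, No Rain and reads off an entry:

    states = ["Rain", "No Rain"]

    # Completed transition matrix: rows = today's state, columns = tomorrow's.
    T = [
        [0.8, 0.2],   # from Rain
        [0.4, 0.6],   # from No Rain
    ]

    # Read off P(No Rain tomorrow | Rain today): first row, second column.
    print(T[states.index("Rain")][states.index("No Rain")])   # prints 0.2

    # Sanity check: each row sums to 1.
    for row in T:
        assert abs(sum(row) - 1.0) < 1e-9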