Winter 2010
Pstat160a
Handout MC#6
Time reversible Markov Chains
1. Definition

Consider an irreducible, aperiodic (= ergodic) Markov chain X_n with transition probability matrix P and stationary distribution π. Suppose the chain is in its equilibrium (steady-state) distribution, that is, P(X_n = i) = π_i. For instance, the chain starts in the stationary distribution, P(X_0 = i) = π_i, or it has been run for a long time. Instead of observing just which state X_n is in, observe transitions. Then the probability that you see a transition from i to j is π_i P_ij (the chain needs to be in i and then move to j). A Markov chain is said to be reversible (or time reversible) if (and only if!) a transition from i to j is as likely as one from j to i, that is:

    X_n reversible:   π_i P_ij = π_j P_ji   for all i, j ∈ S     (detailed balance equations)     (1)
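As a sanity check of definition (1), the sketch below builds a small hypothetical 3-state chain (a random walk on a path, chosen because birth-death chains are always reversible), finds π numerically, and verifies that the transition "flows" π_i P_ij are symmetric. The matrix entries are illustrative values, not from the handout.

```python
import numpy as np

# Hypothetical 3-state example: a random walk on the path 0-1-2
# (birth-death chains like this one are always reversible).
P = np.array([
    [0.5, 0.5, 0.0],
    [0.3, 0.4, 0.3],
    [0.0, 0.5, 0.5],
])

# Stationary distribution: the left eigenvector of P for eigenvalue 1,
# normalised so its entries sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

# Detailed balance equations (1): pi_i P_ij == pi_j P_ji for all i, j,
# i.e. the matrix of transition "flows" is symmetric.
flows = pi[:, None] * P             # flows[i, j] = pi_i P_ij
print(np.allclose(flows, flows.T))  # True: the chain is reversible
```

Here π works out to (3, 5, 3)/11, and the flow matrix is symmetric, so (1) holds for every pair of states.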
Important remarks:

1) This in no way means that P_ij = P_ji, i.e. that the chain is equally likely to move from i to j as from j to i!
2) The result implies that if we can find a π that satisfies the detailed balance equations (1), then X_n is reversible and π is the stationary distribution. When it works (X_n reversible), this is a much simpler way to find π! Indeed, we can check that (1) implies that π is the stationary distribution: just take ∑_j on both sides of (1) and you get the general balance equations for an ergodic chain (equation (1) in handout MC #5). The converse is of course not generally true (it holds only for reversible chains).
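Remark 2 in action: for a birth-death chain, detailed balance determines π one state at a time, with no linear system to solve. The sketch below uses assumed up/down probabilities for a hypothetical 4-state chain and then verifies the answer against the general balance equation πP = π.

```python
import numpy as np

# Hypothetical birth-death chain on states 0..3; detailed balance (1)
# between neighbours gives pi recursively:
#   pi_k * p_k = pi_{k+1} * q_k   =>   pi_{k+1} = pi_k * p_k / q_k
p = [0.5, 0.4, 0.3]   # up-probabilities   P[k, k+1]  (assumed values)
q = [0.5, 0.6, 0.7]   # down-probabilities P[k+1, k]  (assumed values)

pi = [1.0]
for pk, qk in zip(p, q):
    pi.append(pi[-1] * pk / qk)    # solve detailed balance step by step
pi = np.array(pi) / sum(pi)        # normalise so the pi_k sum to 1

# Verify against the general balance equation pi P = pi:
P = np.zeros((4, 4))
for k in range(3):
    P[k, k + 1] = p[k]
    P[k + 1, k] = q[k]
np.fill_diagonal(P, 1 - P.sum(axis=1))   # holding probabilities
print(np.allclose(pi @ P, pi))           # True
```

Summing (1) over the neighbours of each state recovers πP = π, which is exactly the check the remark describes.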
Winter '10, bonnet