e so that
it reaches equilibrium. To obtain many samples, we output the state at widely
separated times. Of course, there is an art to knowing how long one must wait
between outputs for the successive states to be approximately independent. If we
are interested in the expected value of a particular function, then (if the chain
is irreducible and the state space is finite) Theorem 1.23 guarantees that
\[
\frac{1}{n} \sum_{m=1}^{n} f(X_m) \to \sum_{x} f(x)\pi(x)
\]
The Metropolis-Hastings algorithm is often used when the state space is continuous,
but that requires a more sophisticated Markov chain theory, so we will use
discrete examples to illustrate the method.
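The time-average convergence guaranteed by Theorem 1.23 is easy to check numerically. A minimal sketch on a two-state chain, with the transition probabilities a, b and the run length chosen purely for illustration (they are not from the text):

```python
import random

# Two-state chain on {0, 1} with p(0,1) = a and p(1,0) = b.
# Its stationary distribution is pi(0) = b/(a+b), pi(1) = a/(a+b).
a, b = 0.3, 0.6
random.seed(0)

def step(x):
    """One transition of the chain from state x."""
    if x == 0:
        return 1 if random.random() < a else 0
    return 0 if random.random() < b else 1

# Time average of f(x) = x, i.e. the fraction of time spent in state 1.
n = 200_000
x, total = 0, 0
for _ in range(n):
    x = step(x)
    total += x

print(total / n)      # time average (1/n) sum f(X_m)
print(a / (a + b))    # sum_x f(x) pi(x) = pi(1)
```

With these values pi(1) = 0.3/0.9 = 1/3, and the long-run fraction of time in state 1 settles near that number, as the theorem predicts.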
Example 1.35. Geometric distribution. Suppose $\pi(x) = \theta^x(1-\theta)$ for
$x = 0, 1, 2, \ldots$. To generate the jumps we will use a symmetric random walk,
$q(x, x+1) = q(x, x-1) = 1/2$. Since $q$ is symmetric, $r(x, y) = \min\{1, \pi(y)/\pi(x)\}$.
In this case, if $x > 0$, then $\pi(x-1) > \pi(x)$ and $\pi(x+1)/\pi(x) = \theta$, so
\[
p(x, x-1) = 1/2, \qquad p(x, x+1) = \theta/2, \qquad p(x, x) = (1-\theta)/2.
\]
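The accept/reject step in this example can be simulated directly. A minimal sketch, where the value of theta and the run length are illustrative assumptions, not from the text:

```python
import random

# Metropolis-Hastings for the geometric target pi(x) = theta**x * (1 - theta)
# on x = 0, 1, 2, ..., with the symmetric +/-1 random-walk proposal.
theta = 0.5           # illustrative choice
random.seed(1)

def mh_step(x):
    """One Metropolis-Hastings update from state x."""
    y = x + random.choice((-1, 1))       # propose a +/-1 jump
    if y < 0:
        return x                          # pi(-1) = 0, so always reject
    # Symmetric proposal: accept with prob min(1, pi(y)/pi(x)) = min(1, theta**(y-x)).
    # Downward moves (y = x-1) have ratio 1/theta > 1 and are always accepted,
    # matching p(x, x-1) = 1/2; upward moves are accepted with prob theta.
    if random.random() < min(1.0, theta ** (y - x)):
        return y
    return x

# Long-run average of f(x) = x; the mean of this geometric distribution
# is theta / (1 - theta), which equals 1 when theta = 0.5.
n = 200_000
x, total = 0, 0
for _ in range(n):
    x = mh_step(x)
    total += x
print(total / n)
```

The printed time average should land close to theta / (1 - theta), consistent with the chain having pi as its stationary distribution.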
Spring '10
DURRETT