
We run the chain for a long time so that it reaches equilibrium. To obtain many samples, we output the state at widely separated times. Of course, there is an art to knowing how long is long enough to wait between outputs so that the realizations are approximately independent. If we are interested in the expected value of a particular function f, then (if the chain is irreducible and the state space is finite) Theorem 1.23 guarantees that

(1/n) ∑_{m=1}^{n} f(X_m) → ∑_x f(x) π(x)

The Metropolis-Hastings algorithm is often used when the state space is continuous, but that requires a more sophisticated Markov chain theory, so we will use discrete examples to illustrate the method.

Example 1.35. Geometric distribution. Suppose π(x) = θ^x (1 − θ) for x = 0, 1, 2, . . . To generate the jumps we will use a symmetric random walk, q(x, x + 1) = q(x, x − 1) = 1/2. Since q is symmetric, r(x, y) = min{1, π(y)/π(x)}. In this case, if x > 0, then π(x − 1) > π(x) and π(x + 1)/π(x) = θ, so

p(x, x − 1) = 1/2,   p(x, x + 1) = θ/2,   p(x, x) = (1 − θ)/2
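The following is a minimal sketch (not from the text) of the chain in Example 1.35: it runs the Metropolis-Hastings walk for the geometric target, discards a burn-in so the chain can approach equilibrium, and then forms the time average (1/n) ∑ f(X_m) as in Theorem 1.23. The value θ = 0.5, the burn-in length, and the choice f(x) = x are illustrative assumptions, not part of the text.

```python
import random

THETA = 0.5  # assumed value of theta, for illustration only


def pi(x):
    """Target geometric pmf: pi(x) = theta^x * (1 - theta), x = 0, 1, 2, ..."""
    return (THETA ** x) * (1 - THETA)


def mh_step(x):
    """One Metropolis-Hastings step with the symmetric random-walk proposal."""
    y = x + random.choice((-1, 1))           # q(x, x+1) = q(x, x-1) = 1/2
    if y < 0:
        return x                             # pi(y) = 0 off the state space: reject
    # q is symmetric, so the acceptance probability is r(x, y) = min{1, pi(y)/pi(x)}
    if random.random() < min(1.0, pi(y) / pi(x)):
        return y
    return x


def run_chain(n_steps, burn_in=1_000, x0=0):
    """Run the chain from x0, discard a burn-in, return the visited states."""
    x = x0
    for _ in range(burn_in):                 # let the chain approach equilibrium
        x = mh_step(x)
    states = []
    for _ in range(n_steps):
        x = mh_step(x)
        states.append(x)
    return states


if __name__ == "__main__":
    f = lambda x: x                          # estimate E f(X) = theta / (1 - theta)
    states = run_chain(200_000)
    estimate = sum(f(x) for x in states) / len(states)
    print("time average :", estimate)        # close to 1.0 when theta = 0.5
    print("exact value  :", THETA / (1 - THETA))
```

From a state x > 0 the sketch proposes x − 1 with probability 1/2 and always accepts (since π(x − 1)/π(x) = 1/θ > 1), and proposes x + 1 with probability 1/2 and accepts with probability θ, reproducing p(x, x − 1) = 1/2, p(x, x + 1) = θ/2, p(x, x) = (1 − θ)/2 from the example.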