p(θ_j | θ_1^{t+1}, . . . , θ_{j−1}^{t+1}, θ_{j+1}^{t}, . . . , θ_d^{t}, y).
Step 3. Until the stationary distribution and the desired number of draws are reached,
increment t ← t + 1 and return to Step 2.
In each iteration of the Gibbs' sampler, we sequentially update each component of θ^t. We could do the updating in any order; it does not have to be 1, . . . , d.
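The steps above can be sketched in code. This is a minimal illustration, not from the notes: I take as the target a standard bivariate normal with correlation ρ, because its full conditionals are themselves normal, θ_1 | θ_2 ~ N(ρθ_2, 1 − ρ²) and symmetrically for θ_2, so each Gibbs update is a direct draw. All function and variable names here are my own.

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_draws, burn_in=500, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Full conditionals (a known property of this target, assumed here):
        theta_1 | theta_2 ~ N(rho * theta_2, 1 - rho^2), and symmetrically.
    """
    rng = np.random.default_rng(seed)
    theta = np.zeros(2)              # arbitrary starting point theta^0 (Step 1)
    draws = []
    for t in range(burn_in + n_draws):
        # Step 2: sequentially update each component from its conditional
        # posterior, each draw conditioning on the most recent values.
        theta[0] = rng.normal(rho * theta[1], np.sqrt(1 - rho**2))
        theta[1] = rng.normal(rho * theta[0], np.sqrt(1 - rho**2))
        # Step 3: keep iterating; discard early draws taken before the
        # chain has (approximately) reached its stationary distribution.
        if t >= burn_in:
            draws.append(theta.copy())
    return np.array(draws)

samples = gibbs_bivariate_normal(rho=0.8, n_draws=5000)
print(samples.mean(axis=0))              # near (0, 0)
print(np.corrcoef(samples.T)[0, 1])      # near 0.8
```

After burn-in, the empirical mean and correlation of the retained draws should approximate the target's moments, which is a quick sanity check that the chain is mixing.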
The Gibbs' sampler is a special case of Metropolis-Hastings where the proposal distribution is taken to be the conditional posterior distribution. In fact, it is easy to show (but notationally extremely cumbersome) that when we use these conditional posterior distributions as proposal distributions in Metropolis-Hastings, the probability of accepting any proposed move is 1; hence in the Gibbs' sampler we accept every move.
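Since the full argument is cumbersome, here is a sketch for a single-component update (my notation: θ_{−j} denotes all components of θ except θ_j, and θ* agrees with θ^t except possibly in component j). If the proposal redraws component j from its full conditional, so that J(θ* | θ^t) = p(θ*_j | θ_{−j}, y), then the Metropolis-Hastings acceptance ratio is

\[
r \;=\; \frac{p(\theta^* \mid y)\, J(\theta^t \mid \theta^*)}{p(\theta^t \mid y)\, J(\theta^* \mid \theta^t)}
\;=\; \frac{p(\theta^*_j \mid \theta_{-j}, y)\, p(\theta_{-j} \mid y)\; p(\theta^t_j \mid \theta_{-j}, y)}
           {p(\theta^t_j \mid \theta_{-j}, y)\, p(\theta_{-j} \mid y)\; p(\theta^*_j \mid \theta_{-j}, y)} \;=\; 1,
\]

where both posteriors were factored as p(θ | y) = p(θ_j | θ_{−j}, y) p(θ_{−j} | y), which is valid because θ* and θ^t share the same θ_{−j}. Every term cancels, so every proposed move is accepted.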
6.4 Practical considerations

Now that we have seen the general idea of MCMC algorithms and some theory behind them, let us dive into some details.
6.4.1 Proposal distributions

To guarantee existence of a stationary distribution, all that is required (with rare exceptions) is for the proposal distribution J(·, ·) to be such that there is a pos...
This note was uploaded on 03/24/2014 for the course MIT 15.097 taught by Professor Cynthia Rudin during the Spring '12 term at MIT.