MIT15_097S12_lec15


… (θ', θ) p(θ'|y)    (cancel terms)
= J(θ', θ) α(θ', θ) p(θ'|y)    (1 in disguise)
= K(θ', θ) p(θ'|y).

And that is the detailed balance equation. We have now proven that the Metropolis-Hastings simulations will eventually draw from the posterior distribution. However, several important questions remain. What proposal distribution should we use? How many iterations will it take for the chain to get sufficiently close to the stationary distribution? How will we know when the chain has reached its stationary distribution? We will discuss these issues after we introduce the Gibbs sampler.

6.3 Gibbs Sampler

The Gibbs sampler is a very powerful MCMC sampling technique for the special situation in which we have access to the conditional distributions. It is a special case of the Metropolis-Hastings algorithm that is typically much faster, but it can only be used in special cases. Let us express θ ∈ ℝᵈ as θ = [θ1, …, θd]. Suppose that although we are not able to draw directly from p(θ|y) because of the normalization integral, w…
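As a concrete illustration (not from the notes), here is a minimal random-walk Metropolis-Hastings sampler for a one-dimensional target known only up to its normalizing constant. The Gaussian proposal is symmetric, so J cancels in the acceptance ratio and α reduces to min(1, p(θ'|y)/p(θ|y)); the function name and parameters are ours, chosen for the sketch.

```python
import math
import random

def metropolis_hastings(log_target, n_samples, x0=0.0, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings for a 1-D unnormalized log-density.

    The Gaussian proposal J(theta', theta) is symmetric, so it cancels in
    the acceptance ratio alpha = min(1, p(theta'|y) / p(theta|y)).
    """
    rng = random.Random(seed)
    x = x0
    lp = log_target(x)
    samples = []
    for _ in range(n_samples):
        x_prop = x + rng.gauss(0.0, step)        # propose theta' ~ J(.|theta)
        lp_prop = log_target(x_prop)
        # accept with probability min(1, exp(lp' - lp)); min(0, .) avoids overflow
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = x_prop, lp_prop
        samples.append(x)                        # rejected proposals repeat x
    return samples

# Target: a standard normal, pretending we only know it up to a constant.
chain = metropolis_hastings(lambda t: -0.5 * t * t, 20000, seed=1)
```

In practice one discards an initial burn-in segment of the chain before computing posterior summaries, since early iterates still depend on the starting point.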
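To make the "access to conditional distributions" requirement concrete, here is a hypothetical example (not from the notes): a Gibbs sampler for a bivariate normal target with unit variances and correlation ρ, whose full conditionals are available in closed form. Each sweep draws each coordinate exactly from its conditional, with no accept/reject step.

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples, seed=0):
    """Gibbs sampler for a bivariate normal with unit variances, correlation rho.

    The full conditionals are known exactly:
        theta1 | theta2 ~ N(rho * theta2, 1 - rho**2)
        theta2 | theta1 ~ N(rho * theta1, 1 - rho**2)
    """
    rng = random.Random(seed)
    t1, t2 = 0.0, 0.0
    sd = math.sqrt(1.0 - rho * rho)
    draws = []
    for _ in range(n_samples):
        t1 = rng.gauss(rho * t2, sd)   # draw theta1 from p(theta1 | theta2, y)
        t2 = rng.gauss(rho * t1, sd)   # draw theta2 from p(theta2 | theta1, y)
        draws.append((t1, t2))
    return draws

draws = gibbs_bivariate_normal(0.8, 20000, seed=1)
```

Because every conditional draw is accepted, the sampler wastes no proposals, which is one reason it is typically faster than generic Metropolis-Hastings when the conditionals are tractable.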
This note was uploaded on 03/24/2014 for the course MIT 15.097, taught by Professor Cynthia Rudin during the Spring 2012 term at MIT.