Introduction to Gibbs Sampling

October 8, 2010
Readings: Hoff 6

Monte Carlo Sampling

- We have seen that Monte Carlo sampling is a useful tool for sampling from prior and posterior distributions.
- By limiting attention to conjugate prior distributions, all models so far have had tractable posterior distributions, so sampling was not really necessary (although convenient).
- What if we want to use a non-conjugate prior distribution?
- What if we cannot sample from the joint posterior distribution?

Semi-Conjugate Example

Normal sampling model:

    Y_i | θ, φ ~ iid N(θ, 1/φ)

But now assume that θ is independent of φ a priori:

    θ ~ N(m, 1/p)
    φ ~ G(ν₀/2, ν₀s₀²/2)

Posterior distribution:

    p(θ, φ | Y) ∝ [ ∏_i p(y_i | θ, φ) ] p(θ) p(φ)

Factorization of the Joint

    p(θ, φ | Y) ∝ φ^(n/2) exp(−φ(n−1)s²/2) · φ^(ν₀/2 − 1) exp(−φν₀s₀²/2)
                  · exp(−φn(ȳ − θ)²/2) · exp(−p(θ − m)²/2)
                = p(θ | φ, Y) p(φ | Y)
                = p(θ | Y) p(φ | θ, Y)

First Factorization

For θ | φ, Y, complete the square to show that

    θ | φ, Y ~ N( (nφȳ + pm)/(nφ + p), (nφ + p)⁻¹ )

Can we recognize the marginal distribution for φ? No!

Second Factorization

    p(θ, φ | Y) ∝ φ^(n/2) exp(−φ(n−1)s²/2) · φ^(ν₀/2 − 1) exp(−φν₀s₀²/2)
                  · exp(−φn(ȳ − θ)²/2) · exp(−p(θ − m)²/2)

We can recognize

    φ | θ, Y ~ G( (n + ν₀)/2, [(n − 1)s² + n(ȳ − θ)² + ν₀s₀²]/2 )

Sampling?

- Suppose we were given a value θ⁽¹⁾ that comes from the marginal posterior distribution (say θ⁽¹⁾ = ȳ).
- We could then draw a value of φ from the conditional Gamma distribution given θ = θ⁽¹⁾, which would give us a draw (θ⁽¹⁾, φ⁽¹⁾) from the joint distribution.
- φ⁽¹⁾ can be viewed as a draw from the marginal distribution of φ (based on the first factorization), so if we now use the conditional distribution of θ | φ⁽¹⁾, Y to draw a new θ⁽²⁾, we have another sample from the joint distribution.
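The complete-the-square step asserted in the first factorization can be filled in explicitly. Collecting only the terms of the joint posterior that involve θ (a standard derivation, sketched here in the notation above):

```latex
\begin{aligned}
p(\theta \mid \phi, Y)
 &\propto \exp\!\Big(-\tfrac{n\phi}{2}(\bar y-\theta)^2\Big)\,
          \exp\!\Big(-\tfrac{p}{2}(\theta-m)^2\Big) \\
 &\propto \exp\!\Big(-\tfrac{1}{2}\big[(n\phi+p)\,\theta^2
          - 2(n\phi\,\bar y + p\,m)\,\theta\big]\Big) \\
 &\propto \exp\!\Big(-\tfrac{n\phi+p}{2}
          \Big(\theta - \tfrac{n\phi\,\bar y + p\,m}{n\phi+p}\Big)^{\!2}\Big),
\end{aligned}
```

which is the kernel of a normal density with mean (nφȳ + pm)/(nφ + p) and precision nφ + p, matching the stated full conditional for θ.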
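The alternating scheme sketched above is exactly a two-block Gibbs sampler: draw φ from its Gamma full conditional, then θ from its normal full conditional, and repeat. A minimal sketch in Python/NumPy, using the two full conditionals derived in these notes; the hyperparameter defaults (m = 0, p = 1, ν₀ = 1, s₀² = 1), the simulated data, and the function name are illustrative assumptions, not from the notes:

```python
import numpy as np

def gibbs_normal_semiconjugate(y, m=0.0, p=1.0, nu0=1.0, s0sq=1.0,
                               n_iter=5000, seed=0):
    """Two-block Gibbs sampler for the semi-conjugate normal model:
    Y_i | theta, phi ~ N(theta, 1/phi), theta ~ N(m, 1/p),
    phi ~ Gamma(nu0/2, nu0*s0sq/2) (shape/rate parametrization).
    Hyperparameter defaults are illustrative."""
    rng = np.random.default_rng(seed)
    n = len(y)
    ybar = y.mean()
    ssq = y.var(ddof=1)          # sample variance s^2
    theta = ybar                 # start at theta^(1) = ybar, as in the notes
    draws = np.empty((n_iter, 2))
    for t in range(n_iter):
        # phi | theta, Y ~ G((n+nu0)/2, [(n-1)s^2 + n(ybar-theta)^2 + nu0*s0sq]/2)
        shape = (n + nu0) / 2
        rate = ((n - 1) * ssq + n * (ybar - theta) ** 2 + nu0 * s0sq) / 2
        phi = rng.gamma(shape, 1.0 / rate)   # NumPy takes scale = 1/rate
        # theta | phi, Y ~ N((n*phi*ybar + p*m)/(n*phi + p), (n*phi + p)^-1)
        prec = n * phi + p
        theta = rng.normal((n * phi * ybar + p * m) / prec, np.sqrt(1.0 / prec))
        draws[t] = (theta, phi)
    return draws

# Usage: simulated data with true theta = 2 and true phi = 4 (sd = 0.5)
rng = np.random.default_rng(1)
y = rng.normal(2.0, 0.5, size=100)
draws = gibbs_normal_semiconjugate(y)
post = draws[1000:]              # discard burn-in
print(post[:, 0].mean(), post[:, 1].mean())
```

Posterior means of θ and φ should land near the true values used to simulate the data; discarding an initial burn-in segment is the usual practice, since early draws depend on the starting value θ⁽¹⁾ = ȳ.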