Stats C180 / C236 Introduction to Bayesian Statistics
Juana Sanchez
UCLA Department of Statistics
January 14, 2010 (Revised April 30, 2010)

Chapter 9: Gibbs Sampling for the Normal Model

Lab to turn in at the end of the session. Must print before leaving. Do problems 8.2.1, 8.2.2, and 8.2.3.

9.1 Introduction

In the last set of notes we introduced the Gibbs sampler for a simple case: a bivariate posterior distribution for (θ₁, θ₂). In this lesson we will try to reproduce the posterior distributions presented in Hoff's book, page 94. The program here is slightly different, and we also do the plots differently; the results, however, are the same.

Remember, Gibbs sampling approximates the posterior distribution p(θ₁, θ₂) by iteratively drawing random numbers from the full conditionals p(θ₁ | θ₂) and p(θ₂ | θ₁). The distributions of the θ₁ and θ₂ values generated this way are the marginal posterior distributions of θ₁ and θ₂, respectively.

9.2 Obtaining the posterior distribution of (θ, σ² | y₁, ..., yₙ) in the normal model with unknown mean and variance

The joint posterior distribution of (θ, σ²) | y₁, ..., yₙ can be obtained by drawing from these two full conditional distributions.

The conditional distribution of θ given σ² and the data (see chapter 5 of Hoff's book and your class notes for this one):

    θ | σ², y₁, ..., yₙ ~ Normal(μₙ, τₙ²)

with

    μₙ = (μ₀/τ₀² + n·ȳ/σ²) / (1/τ₀² + n/σ²)   and   τₙ² = (1/τ₀² + n/σ²)⁻¹

The conditional distribution of σ² ...
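The two-step iteration described above can be sketched in code. This is a minimal illustration, not the course's own program: the data and prior values (mu0, tau0_sq, nu0, s0_sq) are hypothetical, and since the σ² conditional is cut off above, the sketch assumes the standard semiconjugate form from Hoff's book, where 1/σ² | θ, y is Gamma with shape (ν₀ + n)/2 and rate (ν₀σ₀² + Σ(yᵢ − θ)²)/2.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data and prior settings (illustration only, not from the notes)
y = rng.normal(loc=1.8, scale=1.0, size=20)
n, ybar = len(y), y.mean()
mu0, tau0_sq = 1.9, 0.95**2   # prior mean and variance for theta
nu0, s0_sq = 1.0, 0.01        # prior parameters for sigma^2

S = 5000                      # number of Gibbs iterations
theta, sigma_sq = ybar, y.var()   # starting values
draws = np.empty((S, 2))

for s in range(S):
    # Step 1: draw theta from its full conditional (formula in the notes):
    # theta | sigma^2, y ~ Normal(mu_n, tau_n^2)
    tau_n_sq = 1.0 / (1.0 / tau0_sq + n / sigma_sq)
    mu_n = (mu0 / tau0_sq + n * ybar / sigma_sq) * tau_n_sq
    theta = rng.normal(mu_n, np.sqrt(tau_n_sq))

    # Step 2: draw sigma^2 from its full conditional (assumed
    # semiconjugate inverse-gamma form, via a gamma draw on 1/sigma^2)
    ss = nu0 * s0_sq + np.sum((y - theta) ** 2)
    sigma_sq = 1.0 / rng.gamma((nu0 + n) / 2.0, 2.0 / ss)

    draws[s] = theta, sigma_sq
```

The rows of `draws` are (approximately) samples from the joint posterior p(θ, σ² | y₁, ..., yₙ), so histograms of each column approximate the marginal posteriors discussed in section 9.1.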