# Module 3: Monte Carlo


Module 3: Monte Carlo Bailey Fosdick | CSU | STAA 575
## Gamma-gamma model: More practice

For $i = 1, \ldots, n$, let

$$Y_i \mid \beta \overset{\text{iid}}{\sim} \text{Gamma}(\alpha, \beta),$$

with $\beta > 0$ unknown and $\alpha > 0$ known. Consider the prior $\beta \sim \text{Gamma}(a, b)$, with $a, b > 0$ known. What is a hypothetical data example in which we might use this sampling model?
## Gamma-gamma model: More practice

Sampling model for the data (gamma with shape $\alpha$ and rate $\beta$):

$$p(y_i \mid \beta) = \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, y_i^{\alpha - 1} e^{-\beta y_i}, \qquad y_i > 0.$$

Prior for $\beta$:

$$p(\beta) = \frac{b^{a}}{\Gamma(a)}\, \beta^{a - 1} e^{-b \beta}, \qquad \beta > 0.$$
## Gamma-gamma model: More practice

Derive the posterior distribution of $\beta$.
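A sketch of the derivation, assuming the standard gamma-gamma setup with rate parameterization ($y_i \mid \beta \overset{\text{iid}}{\sim} \text{Gamma}(\alpha, \beta)$, prior $\beta \sim \text{Gamma}(a, b)$):

```latex
\begin{align*}
p(\beta \mid y_1, \ldots, y_n)
  &\propto p(\beta) \prod_{i=1}^{n} p(y_i \mid \beta) \\
  &\propto \beta^{a-1} e^{-b\beta}
     \prod_{i=1}^{n} \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, y_i^{\alpha-1} e^{-\beta y_i} \\
  &\propto \beta^{\,a + n\alpha - 1}
     \exp\!\Big\{ -\Big( b + \sum_{i=1}^{n} y_i \Big)\,\beta \Big\},
\end{align*}
```

which is the kernel of a $\text{Gamma}\big(a + n\alpha,\; b + \sum_{i=1}^{n} y_i\big)$ distribution (shape, rate).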
## Gamma-gamma model: Follow-up questions

- Is this a conjugate prior?
- What is the sufficient statistic for $\beta$?
- What is the posterior mean?
- What is the posterior variance?
- What is a posterior interval for $\beta$? (Or how would you compute it?)
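Sketched answers, assuming the conjugate setup in which the posterior is $\text{Gamma}\big(a + n\alpha,\; b + \sum_i y_i\big)$ (shape, rate):

```latex
\begin{align*}
&\text{Conjugacy: yes; a gamma prior yields a gamma posterior.} \\
&\text{Sufficient statistic (with } \alpha \text{ known): } \textstyle\sum_{i=1}^{n} y_i. \\
&E[\beta \mid y_1, \ldots, y_n] = \frac{a + n\alpha}{b + \sum_i y_i}, \qquad
 \mathrm{Var}(\beta \mid y_1, \ldots, y_n) = \frac{a + n\alpha}{\big(b + \sum_i y_i\big)^2}. \\
&\text{A 95\% posterior interval: the } 0.025 \text{ and } 0.975
 \text{ quantiles of this gamma distribution.}
\end{align*}
```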
## Gamma-gamma model: Posterior predictive distribution

Compute $p(\tilde{y} \mid y_1, \ldots, y_n)$.
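One way to compute it, assuming the $\text{Gamma}(\alpha, \beta)$ sampling model (rate $\beta$) and the conjugate posterior above; write $a_n = a + n\alpha$ and $b_n = b + \sum_{i=1}^{n} y_i$:

```latex
\begin{align*}
p(\tilde{y} \mid y_1, \ldots, y_n)
  &= \int_0^\infty p(\tilde{y} \mid \beta)\, p(\beta \mid y_1, \ldots, y_n)\, d\beta \\
  &= \int_0^\infty \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, \tilde{y}^{\alpha-1} e^{-\beta \tilde{y}}
     \cdot \frac{b_n^{a_n}}{\Gamma(a_n)}\, \beta^{a_n - 1} e^{-b_n \beta}\, d\beta \\
  &= \frac{\Gamma(\alpha + a_n)}{\Gamma(\alpha)\, \Gamma(a_n)}
     \cdot \frac{b_n^{a_n}\, \tilde{y}^{\alpha-1}}{(b_n + \tilde{y})^{\alpha + a_n}},
     \qquad \tilde{y} > 0,
\end{align*}
```

where the last line uses $\int_0^\infty \beta^{\alpha + a_n - 1} e^{-(\tilde{y} + b_n)\beta}\, d\beta = \Gamma(\alpha + a_n) / (\tilde{y} + b_n)^{\alpha + a_n}$.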
## Introduction to Monte Carlo

Often we will want to summarize various aspects of the posterior distribution, or we would like the posterior distribution of a function of our parameters, for example:

- $\Pr(\theta \in A \mid y_1, \ldots, y_n)$
- $E\big[\,|\theta_1 - \theta_2|\;\big|\; y_1, \ldots, y_n\big]$
- $\mathrm{Var}\big(\log \theta_1 - \log \theta_2 \mid y_1, \ldots, y_n\big)$

Obtaining exact values of these quantities can be impossible... BUT if we generate random samples of the parameters from the posterior distribution, then these quantities can be approximated to an **arbitrary degree of precision**.
## Law of Large Numbers

Let $g(\theta)$ be (almost) any function of a parameter. If $\theta^{(1)}, \ldots, \theta^{(S)}$ are i.i.d. samples from $p(\theta \mid y_1, \ldots, y_n)$, then

$$\frac{1}{S} \sum_{s=1}^{S} g\big(\theta^{(s)}\big) \;\longrightarrow\; E\big[g(\theta) \mid y_1, \ldots, y_n\big] \quad \text{as } S \to \infty.$$
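A minimal numerical sketch of this result, using a hypothetical gamma posterior (the shape/rate values below are made up for illustration) and $g(\theta) = \theta^2$, whose exact posterior expectation is known in closed form:

```python
import numpy as np

# Hypothetical posterior: beta | y ~ Gamma(a_n, b_n) with shape a_n and
# RATE b_n; numpy's gamma sampler is parameterized by scale = 1/rate.
a_n, b_n = 3.0, 2.0
S = 200_000

rng = np.random.default_rng(575)
beta_samples = rng.gamma(shape=a_n, scale=1.0 / b_n, size=S)

# Law of Large Numbers: (1/S) * sum g(beta^(s)) -> E[g(beta) | y].
# With g(beta) = beta^2, the exact value is
# E[beta^2] = Var + mean^2 = a_n/b_n^2 + (a_n/b_n)^2 = a_n*(a_n + 1)/b_n^2.
mc_estimate = np.mean(beta_samples**2)
exact_value = a_n * (a_n + 1) / b_n**2

print(f"Monte Carlo estimate of E[beta^2 | y]: {mc_estimate:.4f}")
print(f"Exact value:                           {exact_value:.4f}")
```

With 200,000 draws the two numbers agree to roughly two decimal places; rerunning with a larger `S` tightens the agreement, exactly as the LLN promises.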
## Implications

If $\theta^{(1)}, \ldots, \theta^{(S)}$ are i.i.d. samples from $p(\theta \mid y_1, \ldots, y_n)$,

- $\bar{\theta} = \frac{1}{S} \sum_{s=1}^{S} \theta^{(s)} \;\to\; E[\theta \mid y_1, \ldots, y_n]$
- $\frac{1}{S-1} \sum_{s=1}^{S} \big(\theta^{(s)} - \bar{\theta}\big)^2 \;\to\; \mathrm{Var}(\theta \mid y_1, \ldots, y_n)$
- $\#\big(\theta^{(s)} \le c\big) / S \;\to\; \Pr(\theta \le c \mid y_1, \ldots, y_n)$
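The three approximations above can be sketched as follows, again using a hypothetical gamma posterior whose integer shape gives a closed-form (Erlang) CDF to compare against; all numeric values are illustrative assumptions, e.g. standing in for $a_n = a + n\alpha$ and $b_n = b + \sum_i y_i$:

```python
import math
import numpy as np

# Hypothetical posterior: beta | y ~ Gamma(a_n, b_n), shape a_n (an
# integer, so the CDF has a closed form) and rate b_n.
a_n, b_n, c = 3, 2.0, 1.5
S = 200_000

rng = np.random.default_rng(575)
beta_samples = rng.gamma(shape=a_n, scale=1.0 / b_n, size=S)

# The three Monte Carlo approximations from the slide:
post_mean = beta_samples.mean()           # -> E[beta | y]
post_var = beta_samples.var(ddof=1)       # -> Var(beta | y)  (1/(S-1) form)
prob_le_c = np.mean(beta_samples <= c)    # -> Pr(beta <= c | y)

# Exact values for comparison (Erlang CDF since a_n is an integer):
exact_mean = a_n / b_n
exact_var = a_n / b_n**2
exact_prob = 1.0 - math.exp(-b_n * c) * sum(
    (b_n * c) ** k / math.factorial(k) for k in range(a_n)
)

print(f"mean: MC {post_mean:.4f} vs exact {exact_mean:.4f}")
print(f"var:  MC {post_var:.4f} vs exact {exact_var:.4f}")
print(f"Pr(beta <= {c}): MC {prob_le_c:.4f} vs exact {exact_prob:.4f}")
```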
## Implications

If $\theta^{(1)}, \ldots, \theta^{(S)}$ are i.i.d. samples from $p(\theta \mid y_1, \ldots, y_n)$,

- median of $\{\theta^{(1)}, \ldots, \theta^{(S)}\} \;\to\;$ posterior median of $\theta$
- $\alpha$-percentile of $\{\theta^{(1)}, \ldots, \theta^{(S)}\} \;\to\;$ posterior $\alpha$-percentile of $\theta$
- empirical distribution of $\{\theta^{(1)}, \ldots, \theta^{(S)}\} \;\to\;$ posterior distribution $p(\theta \mid y_1, \ldots, y_n)$
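The quantile-based summaries can be read straight off the samples, as in this sketch with the same hypothetical gamma posterior; the coverage check at the end is just a sanity test that the sample-based 95% interval behaves like a posterior interval:

```python
import numpy as np

# Hypothetical posterior: beta | y ~ Gamma(a_n, b_n), shape a_n, rate b_n.
a_n, b_n = 3.0, 2.0
S = 200_000

rng = np.random.default_rng(575)
beta_samples = rng.gamma(shape=a_n, scale=1.0 / b_n, size=S)

# Quantile-based summaries taken directly from the posterior draws:
post_median = np.median(beta_samples)                    # -> posterior median
q025, q975 = np.quantile(beta_samples, [0.025, 0.975])   # -> 95% interval

print(f"posterior median ~ {post_median:.3f}")
print(f"95% equal-tailed posterior interval ~ ({q025:.3f}, {q975:.3f})")

# Sanity check: a fresh batch of posterior draws should land inside the
# estimated interval about 95% of the time.
fresh = rng.gamma(shape=a_n, scale=1.0 / b_n, size=S)
coverage = np.mean((fresh >= q025) & (fresh <= q975))
print(f"empirical coverage of interval: {coverage:.3f}")
```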
## Monte Carlo facts

Basic idea: use randomness to solve deterministic problems. The more samples you obtain, the better your estimate will be, which is why you can get an estimate to **arbitrary precision**.
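A quick illustration of "more samples, better estimate": the central limit theorem predicts the Monte Carlo error shrinks like $1/\sqrt{S}$, so 100 times the samples should cut the root-mean-squared error by a factor of about 10. The gamma posterior below is again a made-up example:

```python
import numpy as np

# Estimate E[beta | y] = a_n / b_n from S posterior draws, repeat `reps`
# times, and compare the RMSE at a small and a large S.
a_n, b_n = 3.0, 2.0
true_mean = a_n / b_n
reps = 300

rng = np.random.default_rng(575)

def rmse_of_mc_mean(S: int) -> float:
    """Root-mean-squared error of the S-sample Monte Carlo posterior mean."""
    draws = rng.gamma(shape=a_n, scale=1.0 / b_n, size=(reps, S))
    estimates = draws.mean(axis=1)
    return float(np.sqrt(np.mean((estimates - true_mean) ** 2)))

rmse_small = rmse_of_mc_mean(100)
rmse_large = rmse_of_mc_mean(10_000)

# Theory: RMSE ~ sd(beta)/sqrt(S), so 100x the samples => ~10x smaller error.
print(f"RMSE with S = 100:   {rmse_small:.5f}")
print(f"RMSE with S = 10000: {rmse_large:.5f}")
print(f"ratio: {rmse_small / rmse_large:.1f} (theory predicts about 10)")
```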