ECON
Econometrics-I-24

# Part 24: Bayesian Estimation



## Gibbs Sampling from Posteriors: Ω (28/34)

Draw from an inverse gamma for each k = 1, ..., K:

$$p(\gamma_k \mid \beta_1,...,\beta_N, b) \sim \text{Inverse Gamma}[1+N,\; 1+N V_k], \qquad V_k = \frac{1}{N}\sum_{i=1}^{N}(\beta_{i,k} - b_k)^2$$

To make the draw for each k: take 1+N draws $h_{r,k}$ from N[0,1]; then the draw is

$$\gamma_k = \frac{1+N V_k}{\sum_{r=1}^{1+N} h_{r,k}^2}$$
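The inverse gamma draw above can be sketched in a few lines. This is an illustrative implementation of the slide's procedure, not code from the source; the function and variable names (`draw_omega_k`, `beta_k`, `b_k`) are my own.

```python
import numpy as np

rng = np.random.default_rng(0)

def draw_omega_k(beta_k, b_k):
    """One Gibbs draw of the k-th diagonal element of Omega.

    beta_k : length-N array of the individual draws beta_{i,k}
    b_k    : current draw of the population mean for element k
    """
    N = beta_k.shape[0]
    V_k = np.mean((beta_k - b_k) ** 2)        # (1/N) sum_i (beta_{i,k} - b_k)^2
    h = rng.standard_normal(1 + N)            # 1+N draws from N[0,1]
    return (1.0 + N * V_k) / np.sum(h ** 2)   # the inverse gamma draw
```

For large N the draw concentrates near V_k, since the sum of 1+N squared standard normals is close to N.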
Part 24: Bayesian Estimation Gibbs Sampling from Posteriors: i ™    29/34 ( | , ) ( | ) ( , ) M=a constant, L=likelihood, g=prior (This is the definition of the posterior.) Not clear how to sample. Use Metropolis Hastings algorithm. i i i p M L data g β = × β × β b | b


## Metropolis–Hastings Method (30/34)

Define:

- $\beta_{i,0}$ = an "old" draw (vector)
- $\beta_{i,1}$ = the "new" draw (vector)
- $d_r = \sigma \Gamma v_r$, where
  - $\sigma$ = a constant (see below)
  - $\Gamma$ = the diagonal matrix of standard deviations
  - $v_r$ = a vector of K draws from the standard normal
## Metropolis–Hastings: A Draw of β_i (31/34)

- Trial value: $\tilde\beta_{i,1} = \beta_{i,0} + d_r$
- $R = \dfrac{\text{Posterior}(\tilde\beta_{i,1})}{\text{Posterior}(\beta_{i,0})}$ (the constants M cancel)
- U = a random draw from U(0,1)
- If U < R, use $\tilde\beta_{i,1}$; else keep $\beta_{i,0}$.
- During the Gibbs iterations, $\sigma$ controls the acceptance rate. Try for .4.
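The accept/reject step above can be sketched as follows. This is a minimal illustration (names such as `mh_step` and `log_post` are mine); it works on the log scale, which is equivalent to comparing U with R directly but numerically safer.

```python
import numpy as np

rng = np.random.default_rng(0)

def mh_step(beta_old, log_post, sigma, Gamma_diag):
    """One Metropolis-Hastings update of beta_i.

    log_post   : function returning the log posterior kernel (M cancels in the ratio)
    sigma      : tuning constant controlling the acceptance rate
    Gamma_diag : vector of standard deviations (diagonal of Gamma)
    """
    v = rng.standard_normal(beta_old.shape[0])   # K standard normal draws
    trial = beta_old + sigma * Gamma_diag * v    # trial value = old draw + d_r
    logR = log_post(trial) - log_post(beta_old)  # log of Posterior(new)/Posterior(old)
    U = rng.uniform()                            # draw from U(0,1)
    return trial if np.log(U) < logR else beta_old
```

In practice one tunes `sigma` during the burn-in period until roughly 40% of trial draws are accepted, as the slide suggests.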


## Application: Energy Suppliers (32/34)

- N = 361 individuals, 2 to 12 hypothetical suppliers
- X = (1) fixed rates, (2) contract length, (3) local (0,1), (4) well-known company (0,1), (5) offers TOD rates (0,1), (6) offers seasonal rates (0,1)
## Estimates: Mean of Individual β_i (33/34)

|            | MSL Estimate    | Bayes Posterior Mean |
|------------|-----------------|----------------------|
| Price      | -1.04 (0.396)   | -1.04 (0.0374)       |
| Contract   | -0.208 (0.0240) | -0.194 (0.0224)      |
| Local      | 2.40 (0.127)    | 2.41 (0.140)         |
| Well Known | 1.74 (0.0927)   | 1.71 (0.100)         |
| TOD        | -9.94 (0.337)   | -10.0 (0.315)        |
| Seasonal   | -10.2 (0.333)   | -10.2 (0.310)        |


## Reconciliation: A Theorem (Bernstein–von Mises) (34/34)

- The posterior distribution converges to normal with covariance matrix equal to 1/N times the inverse of the information matrix, the same as for the classical MLE. (The distribution that is converging is the posterior, not the sampling distribution of the estimator of the posterior mean.)
- The (empirical) posterior mean converges to the mode of the likelihood function, the same as the MLE. A proper prior disappears asymptotically.
- The asymptotic sampling distribution of the posterior mean is the same as that of the MLE.
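The theorem can be illustrated numerically in a conjugate case. The sketch below uses a Bernoulli model with a Beta(2,2) prior and made-up data (N and s are hypothetical numbers, not from the slides): the exact posterior mean is close to the MLE, and the exact posterior variance is close to 1/N times the inverse Fisher information evaluated at the MLE.

```python
# Bernoulli likelihood with a proper Beta(2,2) prior (illustrative numbers).
N, s = 1000, 620                       # N trials, s successes
a, b = 2.0, 2.0                        # prior parameters; effect vanishes asymptotically
post_a, post_b = a + s, b + N - s      # conjugate Beta posterior

post_mean = post_a / (post_a + post_b)
post_var = post_a * post_b / ((post_a + post_b) ** 2 * (post_a + post_b + 1))

mle = s / N                            # mode of the likelihood
bvm_var = mle * (1 - mle) / N          # 1/N times inverse information at the MLE
```

With N = 1000 the posterior mean and the MLE agree to three decimals, and the two variances differ by well under one percent.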

