lec9 - Lecture 9, 9.1 Prior and posterior distributions.


Lecture 9

9.1 Prior and posterior distributions. (Textbook, Sections 6.1 and 6.2)

Assume that the sample X₁, ..., Xₙ is i.i.d. with a distribution that comes from the family {P_θ : θ ∈ Θ}, and we would like to estimate the unknown parameter θ. So far we have discussed two methods: the method of moments and maximum likelihood estimation. In both methods we tried to find an estimate in the set Θ such that the corresponding distribution in some sense best describes the data. We didn't make any additional assumptions about the nature of the sample, and used only the sample itself to construct the estimate of θ. In the next few lectures we will discuss a different approach to this problem, called Bayes estimators. In this approach one would like to incorporate into the estimation process some a priori intuition or theory about the parameter θ. The way one describes this a priori intuition is by considering a distribution on the set of parameters Θ or, in other words, by thinking of the parameter θ as a random variable. Let ξ(θ) be a p.d.f. ...
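As a concrete illustration of this idea (not taken from the lecture itself), here is a minimal sketch of Bayesian updating in the simplest conjugate setting: i.i.d. Bernoulli(θ) observations with a Beta(a, b) prior density ξ(θ). Under these assumptions the posterior is again a Beta distribution, so "incorporating a priori intuition" amounts to choosing a and b before seeing the data. The function name and the particular sample below are hypothetical, chosen only for the example.

```python
# Hypothetical sketch: Bayesian updating for Bernoulli data with a Beta prior.
# Prior: theta ~ Beta(a, b), i.e. xi(theta) proportional to
# theta^(a-1) * (1-theta)^(b-1). Because the Beta family is conjugate to the
# Bernoulli likelihood, the posterior given 0/1 observations is
# Beta(a + #successes, b + #failures).

def beta_bernoulli_posterior(data, a=1.0, b=1.0):
    """Return the (a, b) parameters of the Beta posterior given 0/1 data.

    The prior parameters a, b encode the a priori intuition about theta
    held before observing the sample.
    """
    successes = sum(data)
    failures = len(data) - successes
    return a + successes, b + failures

# Example: a uniform prior Beta(1, 1) and a sample with 7 successes in 10 trials.
sample = [1, 1, 0, 1, 1, 1, 0, 1, 0, 1]
a_post, b_post = beta_bernoulli_posterior(sample, a=1.0, b=1.0)

# One common Bayes estimate of theta is the posterior mean a / (a + b).
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, posterior_mean)  # 8.0 4.0 0.666...
```

Note how the prior acts like a + b "pseudo-observations" added to the sample; as n grows, the data dominate and the posterior mean approaches the sample frequency, which is also the maximum likelihood estimate here.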

This note was uploaded on 10/11/2009 for the course STATISTICS 18.443, taught by Professor Dmitry Panchenko during the Spring '09 term at MIT.


