Lecture 9

9.1 Prior and posterior distributions. (Textbook, Sections 6.1 and 6.2)

Assume that the sample X_1, ..., X_n is i.i.d. with a distribution that comes from the family {P_θ : θ ∈ Θ} and that we would like to estimate the unknown parameter θ. So far we have discussed two methods: the method of moments and maximum likelihood estimates. In both methods we tried to find an estimate in the set Θ such that the corresponding distribution in some sense best describes the data. We didn't make any additional assumptions about the nature of the sample and used only the sample to construct the estimate of θ. In the next few lectures we will discuss a different approach to this problem, called Bayes estimators. In this approach one would like to incorporate into the estimation process some a priori intuition or theory about the parameter θ. The way one describes this a priori intuition is by considering a distribution on the set of parameters Θ or, in other words, by thinking of the parameter θ as a random variable. Let ξ(θ) be a p.d.f. ...
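The idea above, a prior distribution on the parameter updated by the data into a posterior, can be sketched with the standard conjugate Beta-Bernoulli pair (an illustrative example, not taken from the lecture): if θ has prior Beta(a, b) and X_1, ..., X_n are i.i.d. Bernoulli(θ), the posterior of θ given the sample is Beta(a + Σx_i, b + n - Σx_i).

```python
def posterior_params(a, b, sample):
    """Update a Beta(a, b) prior on theta using an i.i.d.
    Bernoulli(theta) sample (a list of 0s and 1s)."""
    s = sum(sample)          # number of successes
    n = len(sample)          # sample size
    return a + s, b + n - s  # posterior is Beta(a + s, b + n - s)

def posterior_mean(a, b, sample):
    """Posterior mean of theta, a common choice of Bayes estimator."""
    a_post, b_post = posterior_params(a, b, sample)
    return a_post / (a_post + b_post)

# Example: a uniform prior Beta(1, 1) and 7 successes in 10 trials.
sample = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1]
print(posterior_params(1, 1, sample))  # -> (8, 4)
print(posterior_mean(1, 1, sample))    # -> 8/12, about 0.667
```

Note how the posterior mean pulls the raw frequency 7/10 slightly toward the prior mean 1/2; with more data the sample dominates and the influence of the prior fades.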
This note was uploaded on 10/11/2009 for the course STATISTICS 18.443 taught by Professor Dmitrypanchenko during the Spring '09 term at MIT.
