Lecture 9

9.1 Prior and posterior distributions. (Textbook, Sections 6.1 and 6.2)

Assume that the sample X1, . . . , Xn is i.i.d. with distribution Pθ that comes from the family { Pθ : θ ∈ Θ }, and we would like to estimate the unknown parameter θ. So far we have discussed two methods: the method of moments and maximum likelihood estimation. In both methods we tried to find an estimate θ̂ in the set Θ such that the distribution Pθ̂ in some sense best describes the data. We did not make any additional assumptions about the nature of the sample, and used only the sample itself to construct the estimate of θ.

In the next few lectures we will discuss a different approach to this problem, called Bayes estimators. In this approach one would like to incorporate into the estimation process some a priori intuition or theory about the parameter θ. The way one describes this a priori intuition is by considering a distribution on the set of parameters Θ or, in other words, by thinking of the parameter θ as a random variable. Let ξ(θ) be a p.d.f. or p.f. on Θ, called the prior distribution of θ.
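As a small numerical illustration of this idea (not part of the lecture text), consider the standard conjugate example: a Bernoulli sample with unknown success probability θ and a Beta(a, b) prior ξ(θ). The posterior is again a Beta distribution, obtained by adding the observed successes and failures to the prior parameters. The data values below are made up for the example.

```python
def posterior_beta(a, b, data):
    """Update a Beta(a, b) prior on a Bernoulli parameter theta
    with observed 0/1 data. By conjugacy, the posterior is
    Beta(a + #successes, b + #failures)."""
    successes = sum(data)
    failures = len(data) - successes
    return a + successes, b + failures

# Beta(1, 1) is the uniform prior on [0, 1]: no a priori preference for theta.
data = [1, 0, 1, 1, 0, 1, 1, 1]          # hypothetical Bernoulli observations
a_post, b_post = posterior_beta(1, 1, data)

# The posterior mean a/(a+b) is a natural Bayes estimate of theta.
post_mean = a_post / (a_post + b_post)
```

With a uniform prior and 6 successes out of 8 trials, the posterior is Beta(7, 3), whose mean 7/10 is close to, but not identical to, the maximum likelihood estimate 6/8; the prior pulls the estimate slightly toward 1/2.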
Spring '09, Dmitry Panchenko