# Lecture 9

## 9.1 Prior and posterior distributions (Textbook, Sections 6.1 and 6.2)

Assume that the sample X_1, ..., X_n is i.i.d. with distribution P_θ that comes from the family {P_θ : θ ∈ Θ}, and we would like to estimate the unknown parameter θ. So far we have discussed two methods: the method of moments and maximum likelihood estimation. In both methods we tried to find an estimate θ̂ in the set Θ such that the distribution P_θ̂ in some sense best describes the data. We did not make any additional assumptions about the nature of the sample, and used only the sample itself to construct the estimate of θ.

In the next few lectures we will discuss a different approach to this problem, called Bayes estimators. In this approach one would like to incorporate into the estimation process some a priori intuition or theory about the parameter θ. The way one describes this a priori intuition is by considering a distribution on the set of parameters Θ; in other words, one thinks of the parameter θ as a random variable. Let ξ(θ) be a p.d.f. ...