Lecture 22-2007 - Bayesian Estimation and Confidence...

Bayesian Estimation and Confidence Intervals
Lecture XXII

I. Bayesian Estimation
A. Implicitly, in our previous discussions of estimation, we adopted a classical viewpoint.
1. We had some process generating random observations.
2. This random process was a function of fixed, but unknown, parameters.
3. We then designed procedures to estimate these unknown parameters based on observed data.
B. Specifically, assume that a random process, such as admitting students to the University of Florida, generated heights, and that this height process can be characterized by a normal distribution.
1. We can estimate the parameters of this distribution using maximum likelihood.
2. The likelihood of a particular sample can be expressed as

L(X_1, X_2, ..., X_n | mu, sigma^2) = (2 pi sigma^2)^(-n/2) exp[ -(1/(2 sigma^2)) sum_{i=1}^{n} (X_i - mu)^2 ]

3. Our estimates of mu and sigma^2 are then the values of each parameter that maximize the likelihood of drawing that sample.
C. Turning this process around slightly, Bayesian analysis assumes that we can make some kind of probability statement about the parameters before we start. The sample is then used to update our prior distribution.
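The two ideas above (maximum likelihood estimation of a normal mean and variance, and updating a prior with the sample) can be sketched numerically. This is a minimal illustration, not the lecture's own derivation: the simulated height data, the prior mean `mu0`, and the prior variance `tau0_sq` are all hypothetical, and the Bayesian step uses the standard normal-normal conjugate update for the mean with the variance treated as known for simplicity.

```python
import numpy as np

# Hypothetical sample of student heights in inches (illustrative data only).
rng = np.random.default_rng(0)
heights = rng.normal(loc=70.0, scale=3.0, size=50)
n = len(heights)

# Maximum likelihood estimates for the normal model:
# mu_hat is the sample mean; sigma2_hat is the mean squared deviation
# (divisor n, not n-1, since these are the ML estimators).
mu_hat = heights.mean()
sigma2_hat = ((heights - mu_hat) ** 2).mean()

# Bayesian update of a normal prior on mu, treating the variance as known.
# Prior: mu ~ N(mu0, tau0_sq). By conjugacy the posterior is also normal,
# with precision equal to the prior precision plus the data precision.
mu0, tau0_sq = 68.0, 4.0      # assumed prior belief about mean height
sigma_sq = sigma2_hat         # plug-in "known" variance for illustration

post_var = 1.0 / (1.0 / tau0_sq + n / sigma_sq)
post_mean = post_var * (mu0 / tau0_sq + n * mu_hat / sigma_sq)

print(f"ML:        mu_hat = {mu_hat:.2f}, sigma2_hat = {sigma2_hat:.2f}")
print(f"Posterior: mean   = {post_mean:.2f}, variance   = {post_var:.4f}")
```

Note that the posterior mean is a precision-weighted average of the prior mean and the sample mean, so it always lies between them, and the posterior variance is smaller than the prior variance: the sample has sharpened the prior, which is exactly the updating described in point C.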