Stat 5102 Lecture Slides: Deck 4

Charles J. Geyer, School of Statistics, University of Minnesota
Bayesian Inference

Now for something completely different. Everything we have done up to now is frequentist statistics. Bayesian statistics is very different.

Bayesians don't do confidence intervals and hypothesis tests. Bayesians don't use sampling distributions of estimators. Modern Bayesians aren't even interested in point estimators. So what do they do?

Bayesians treat parameters as random variables. To a Bayesian, probability is the only way to describe uncertainty. Things not known for certain, like values of parameters, must be described by a probability distribution.
Bayesian Inference (cont.)

Suppose you are uncertain about something. Then your uncertainty is described by a probability distribution called your prior distribution.

Suppose you obtain some data relevant to that thing. The data changes your uncertainty, which is then described by a new probability distribution called your posterior distribution. The posterior distribution reflects the information both in the prior distribution and the data.

Most of Bayesian inference is about how to go from prior to posterior.
Bayesian Inference (cont.)

The way Bayesians go from prior to posterior is to use the laws of conditional probability, sometimes called in this context Bayes rule or Bayes theorem.

Suppose we have a PDF g for the prior distribution of the parameter θ, and suppose we obtain data x whose conditional PDF given θ is f. Then the joint distribution of data and parameters is conditional times marginal
$$
f(x \mid \theta)\, g(\theta)
$$
This may look strange because, up to this point in the course, you have been brainwashed in the frequentist paradigm. Here both x and θ are random variables.
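One concrete way to see that both x and θ are random here is to simulate from the joint density f(x | θ)g(θ): draw θ from the prior, then draw x given that θ. Below is a minimal sketch, not from the slides, which assumes (purely for illustration, anticipating the example on later slides) a Beta prior and a binomial model with made-up numerical values.

```python
# A minimal simulation sketch (not from the slides): a draw from the joint
# density f(x | theta) g(theta) is obtained by drawing theta from the prior g,
# then x from f(. | theta). The Beta prior and binomial model anticipate the
# example on later slides; the numerical values are made-up illustrations.
import numpy as np

rng = np.random.default_rng(0)
alpha1, alpha2, n = 2.0, 3.0, 10                 # assumed illustrative values

theta = rng.beta(alpha1, alpha2, size=100_000)   # theta ~ g(theta), the prior
x = rng.binomial(n, theta)                       # x | theta ~ Bin(n, theta)

# Each (theta[i], x[i]) pair is one draw from the joint distribution.
print(theta[:3], x[:3])
```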
Bayesian Inference (cont.)

The correct posterior distribution, according to the Bayesian paradigm, is the conditional distribution of θ given x, which is joint divided by marginal
$$
h(\theta \mid x) = \frac{f(x \mid \theta)\, g(\theta)}{\int f(x \mid \theta)\, g(\theta)\, d\theta}
$$
Often we do not need to do the integral. If we recognize that
$$
\theta \mapsto f(x \mid \theta)\, g(\theta)
$$
is, except for constants, the PDF of a brand name distribution, then that distribution must be the posterior.
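When the product f(x | θ)g(θ) is not recognizable as a brand name PDF, the integral in the denominator can still be approximated numerically on a grid of θ values. A minimal sketch, not from the slides, again assuming the beta-binomial setup of the following slides with made-up parameter and data values:

```python
# A minimal numerical sketch (not from the slides): approximate the
# normalizing integral of f(x | theta) g(theta) on a grid of theta values.
# The beta-binomial setup and all numerical values are assumed illustrations.
import numpy as np
from scipy import stats

alpha1, alpha2 = 2.0, 3.0      # prior hyperparameters (assumed)
n, x = 10, 7                   # data: 7 successes in 10 trials (assumed)

theta = np.linspace(1e-6, 1 - 1e-6, 10_001)
joint = stats.binom.pmf(x, n, theta) * stats.beta.pdf(theta, alpha1, alpha2)

# h(theta | x) = joint / integral of the joint over theta (Riemann sum here).
dtheta = theta[1] - theta[0]
posterior = joint / (joint.sum() * dtheta)

# Matches, up to grid error, the Beta(x + alpha1, n - x + alpha2) posterior
# identified on the next slide.
print(np.max(np.abs(posterior - stats.beta.pdf(theta, x + alpha1, n - x + alpha2))))
```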
Binomial Data, Beta Prior

Suppose the prior distribution for p is Beta(α₁, α₂) and the conditional distribution of x given p is Bin(n, p). Then
$$
f(x \mid p) = \binom{n}{x} p^x (1 - p)^{n - x}
$$
$$
g(p) = \frac{\Gamma(\alpha_1 + \alpha_2)}{\Gamma(\alpha_1)\Gamma(\alpha_2)} p^{\alpha_1 - 1} (1 - p)^{\alpha_2 - 1}
$$
and
$$
f(x \mid p)\, g(p) = \binom{n}{x} \frac{\Gamma(\alpha_1 + \alpha_2)}{\Gamma(\alpha_1)\Gamma(\alpha_2)} \cdot p^{x + \alpha_1 - 1} (1 - p)^{n - x + \alpha_2 - 1}
$$
and this, considered as a function of p for fixed x is, except for constants, the PDF of a Beta(x + α₁, n - x + α₂) distribution. So that is the posterior.
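In code, the prior-to-posterior update just derived is a change of Beta parameters. A minimal sketch, not from the slides, with assumed illustrative values of α₁, α₂, n, and x:

```python
# A minimal sketch (not from the slides) of the conjugate update just derived:
# a Beta(alpha1, alpha2) prior and Bin(n, p) data give a
# Beta(x + alpha1, n - x + alpha2) posterior. Numerical values are assumed.
from scipy import stats

alpha1, alpha2 = 2.0, 3.0        # prior hyperparameters (assumed)
n, x = 10, 7                     # 7 successes in 10 trials (assumed)

posterior = stats.beta(x + alpha1, n - x + alpha2)

print(posterior.mean())          # posterior mean: (x + alpha1) / (n + alpha1 + alpha2)
print(posterior.interval(0.95))  # central 95% posterior interval for p
```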
Binomial Data, Beta Prior (cont.)

A bit slower, for those for whom that was too fast. If we look up the Beta(α₁, α₂) distribution in the brand name distributions handout, we see the PDF
$$
f(x) = \frac{\Gamma(\alpha_1 + \alpha_2)}{\Gamma(\alpha_1)\Gamma(\alpha_2)} x^{\alpha_1 - 1} (1 - x)^{\alpha_2 - 1}, \qquad 0 < x < 1
$$
We want g(p). To get that we must change f to g, which is trivial, and x to p, which requires some care. That is how we got
$$
g(p) = \frac{\Gamma(\alpha_1 + \alpha_2)}{\Gamma(\alpha_1)\Gamma(\alpha_2)} p^{\alpha_1 - 1} (1 - p)^{\alpha_2 - 1}
$$
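As a sanity check on the relabeling, the formula for g(p) can be evaluated directly and compared against a library Beta density. A minimal sketch, not from the slides, with assumed illustrative values:

```python
# A minimal check (not from the slides) that the relabeled density g(p)
# matches a library Beta PDF at one point; all numerical values are assumed.
from math import gamma

from scipy import stats

alpha1, alpha2, p = 2.0, 3.0, 0.4   # assumed illustrative values

# g(p) written exactly as on the slide, with the handout's x renamed to p.
g_p = (gamma(alpha1 + alpha2) / (gamma(alpha1) * gamma(alpha2))
       * p ** (alpha1 - 1) * (1 - p) ** (alpha2 - 1))

print(g_p, stats.beta.pdf(p, alpha1, alpha2))   # the two numbers agree
```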