knowledge of the system
at hand. We then use these quantities, together with the data, to compute
the posterior. The likelihood, prior, and posterior are all related via Bayes’
rule:
$$
p(\theta \mid y) = \frac{p(y \mid \theta)\, p(\theta)}{p(y)} = \frac{p(y \mid \theta)\, p(\theta)}{\int p(y \mid \theta')\, p(\theta')\, d\theta'}, \qquad (1)
$$
where the second step uses the law of total probability. Unfortunately, the
integral in the denominator, called the partition function, is often intractable.
This intractability is what makes Bayesian analysis difficult, and the remainder
of these notes will essentially cover methods for avoiding that integral.
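For low-dimensional parameters, one way to sidestep the intractable integral is to approximate it numerically. The sketch below (an illustration, not part of the original notes) assumes a Bernoulli likelihood and a uniform prior on [0, 1], and approximates the partition function p(y) with a Riemann sum over a grid of θ values:

```python
import numpy as np

def grid_posterior(y, n_grid=1000):
    """Approximate p(theta | y) on a grid, assuming y ~ Bernoulli(theta)
    with a uniform prior on theta (assumed model, for illustration only)."""
    theta = np.linspace(1e-6, 1 - 1e-6, n_grid)     # grid over parameter space
    prior = np.ones(n_grid)                          # uniform prior p(theta)
    heads = np.sum(y)
    tails = len(y) - heads
    likelihood = theta**heads * (1 - theta)**tails   # p(y | theta)
    unnorm = likelihood * prior                      # numerator of Bayes' rule
    # Partition function p(y) = integral p(y | theta') p(theta') dtheta',
    # replaced here by a Riemann sum over the grid.
    dtheta = theta[1] - theta[0]
    evidence = np.sum(unnorm) * dtheta
    return theta, unnorm / evidence

y = np.array([1, 0, 1, 1, 0, 1, 1, 1])              # made-up data: 6 heads in 8 flips
theta, post = grid_posterior(y)
print(theta[np.argmax(post)])                        # posterior mode, near 6/8 = 0.75
```

With a uniform prior the posterior mode coincides with the proportion of heads; the grid sum is only feasible because θ is one-dimensional, which is exactly why general-purpose methods for avoiding the integral are needed.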
Coin Flip Example Part 1. Suppose we have been given data from a series
of m coin flips, and we are not sure whether the coin is fair. We might
assume that the data were generated by a sequence of independent draws
from a Bernoulli distribution, parameterized by θ, which is the probability of
flipping Heads.
But what’s the value of θ? That is, which Bernoulli distribution generated
these data?
We could estimate θ as the proportion of the flips that are H...
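The proportion-of-heads estimate is the maximum-likelihood estimate for i.i.d. Bernoulli draws. A minimal sketch, with made-up flip data:

```python
import numpy as np

# Hypothetical data: 1 = Heads, 0 = Tails (illustrative, not from the notes).
flips = np.array([1, 1, 0, 1, 0, 1, 0, 1, 1, 1])

# Maximum-likelihood estimate of theta: the empirical proportion of Heads.
theta_hat = flips.mean()
print(theta_hat)   # 0.7 for this sample (7 Heads out of 10 flips)
```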
Spring '12, Cynthia Rudin