Math 34920
Spring term 2006
Bayesian Modelling B
by Peter Green (University of Bristol, P.J.Green@bristol.ac.uk).
modelling complex data structures
conditional independence and graphical models
exchangeability and hierarchical models
Markov chain Monte Carlo
Detailed balance
The key idea in most practical MCMC methods is reversibility or detailed balance, i.e. a balance in the flow of probability (represented by π(x)P(x, x')) between the states of the Markov chain.

We say the distribution π is in detailed balance with respect to the transition kernel P if

    π(x)P(x, x') = π(x')P(x', x)   for all x, x'.
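As a quick numerical illustration of the definition (a sketch, not from the slides), the chain below is built as a random walk on a symmetric weight matrix, a standard construction that guarantees detailed balance; the weights w are made up for illustration.

```python
import numpy as np

# Build a reversible chain from symmetric edge weights w (hypothetical values):
# P[i, j] = w[i, j] / w[i].sum(),  pi[i] proportional to w[i].sum().
w = np.array([[0.0, 2.0, 1.0],
              [2.0, 0.0, 3.0],
              [1.0, 3.0, 0.0]])            # symmetric weights => reversible chain
P = w / w.sum(axis=1, keepdims=True)       # transition matrix (rows sum to 1)
pi = w.sum(axis=1) / w.sum()               # candidate stationary distribution

# Detailed balance: pi[i] P[i, j] = pi[j] P[j, i] for all i, j,
# i.e. the matrix of probability flows is symmetric.
flow = pi[:, None] * P
assert np.allclose(flow, flow.T)

# Detailed balance implies pi is stationary: pi P = pi.
assert np.allclose(pi @ P, pi)
```

The flow matrix pi[i]·P[i, j] is exactly the "flow of probability" the slide refers to: detailed balance says each pairwise flow is matched by the reverse flow.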
If you knew which coin had been selected, this probability would be either 0.8 (coin A) or 0.2 (coin B), and would again be independent of the number of previous heads. However, the coin is unknown, and this induces a dependency between the outcomes of the first 10 tosses.
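The induced dependency can be made concrete with a short calculation (a sketch; it assumes equal prior probability on the two coins, which the slide does not state). The predictive probability of a head changes with the number of heads already seen, precisely because the data shift the posterior over which coin is in use.

```python
# Coin A: P(head) = 0.8; coin B: P(head) = 0.2.
# Assumption (not in the slide): each coin has prior probability 1/2.
def prob_next_head(heads, tosses):
    """Predictive P(next toss is a head | `heads` heads in `tosses` tosses)."""
    like_a = 0.8 ** heads * 0.2 ** (tosses - heads)   # likelihood under coin A
    like_b = 0.2 ** heads * 0.8 ** (tosses - heads)   # likelihood under coin B
    post_a = like_a / (like_a + like_b)               # posterior P(coin A | data)
    return post_a * 0.8 + (1 - post_a) * 0.2

print(prob_next_head(0, 0))    # 0.5 before any data
print(prob_next_head(9, 10))   # close to 0.8: data point strongly to coin A
print(prob_next_head(1, 10))   # close to 0.2: data point strongly to coin B
```

Marginally the tosses are dependent: past heads change the probability of future heads, even though tosses are independent given the coin.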
How do we sample from the posterior?
In general, we want samples from the joint posterior distribution π(θ) = p(θ | data).

Independent sampling from π may be difficult, BUT sampling from a Markov chain with π as its stationary (equilibrium) distribution turns out to be easier.
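One standard way to construct such a chain (not named on this slide) is random-walk Metropolis: propose a symmetric move and accept it with a probability that enforces detailed balance with respect to π. The sketch below targets a standard normal, chosen purely for illustration, since only an unnormalised density is needed.

```python
import math
import random

def log_pi(x):
    """Log of the target density, up to an additive constant (standard normal)."""
    return -0.5 * x * x

def metropolis(n_steps, step=1.0, x0=0.0, seed=0):
    """Random-walk Metropolis sampler targeting exp(log_pi)."""
    rng = random.Random(seed)
    x, out = x0, []
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step)                  # symmetric proposal
        if math.log(rng.random()) < log_pi(y) - log_pi(x):
            x = y                                      # accept move
        out.append(x)                                  # rejection repeats x
    return out

draws = metropolis(20000)
# For a long run, the draws behave like (correlated) samples from the target:
# sample mean near 0, sample variance near 1.
```

Note that only ratios of π appear, so the normalising constant of the posterior is never needed, which is what makes the Markov chain route "easier".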
Gibbs sampling, continued

Starting from an initial state (θ1^(0), θ2^(0)):

Sample θ1^(1) from p(θ1 | θ2^(0)), then sample θ2^(1) from p(θ2 | θ1^(1)), then repeat.

The sequence (θ1^(t), θ2^(t)), t = 0, 1, 2, ..., forms a Markov chain with stationary distribution π.

For large t, θ^(t) has distribution close to π, and, more to the point, the sample {θ^(t)} looks like a sample from π.
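The alternating scheme above can be sketched in a few lines. The target here is a toy example of my choosing, not one from the slides: a bivariate normal with correlation rho, for which each full conditional is a univariate normal, p(θ1 | θ2) = N(ρθ2, 1 − ρ²) and symmetrically for θ2.

```python
import math
import random

def gibbs(n_steps, rho=0.8, seed=1):
    """Gibbs sampler for a bivariate normal with unit variances, correlation rho."""
    rng = random.Random(seed)
    t1, t2 = 0.0, 0.0                       # initial state (theta1^(0), theta2^(0))
    sd = math.sqrt(1.0 - rho * rho)         # conditional standard deviation
    draws = []
    for _ in range(n_steps):
        t1 = rng.gauss(rho * t2, sd)        # sample theta1 from p(theta1 | theta2)
        t2 = rng.gauss(rho * t1, sd)        # sample theta2 from p(theta2 | theta1)
        draws.append((t1, t2))
    return draws

draws = gibbs(20000)
# For large t the pairs look like a sample from the joint target:
# each margin has mean near 0 and variance near 1, and the sample
# correlation is near rho.
```

Each update only needs a full conditional distribution, never the joint density itself; that is the practical appeal of the Gibbs sampler.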
Example