Math 34920
Spring term 2006
Bayesian Modelling B
by Peter Green (University of Bristol, [email protected]).
modelling complex data structures
conditional independence and graphical models
exc
Detailed balance

The key idea in most practical MCMC methods is reversibility or detailed balance, i.e. a balance in the flow of probability (represented by pi(x) P(x, x')) between the states of the Markov chain. A transition kernel P is in detailed balance with pi if

pi(x) P(x, x') = pi(x') P(x', x) for all pairs of states x, x'.
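As a concrete illustration of the condition above, here is a minimal numerical check on a hypothetical 3-state chain (this example is not from the notes): we build a Metropolis kernel from a symmetric proposal and verify that pi(x) P(x, x') = pi(x') P(x', x) holds, which in turn implies pi is stationary.

```python
import numpy as np

# Hypothetical 3-state target distribution pi (illustrative values only).
pi = np.array([0.2, 0.3, 0.5])

# Symmetric random-walk proposal Q: move to a neighbour with prob 1/2,
# reflecting at the ends, so Q[x, y] == Q[y, x].
Q = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.5]])

# Metropolis kernel: accept a proposed move x -> y with prob min(1, pi[y]/pi[x]).
P = np.zeros((3, 3))
for x in range(3):
    for y in range(3):
        if x != y:
            P[x, y] = Q[x, y] * min(1.0, pi[y] / pi[x])
    P[x, x] = 1.0 - P[x].sum()  # rejected proposals stay at x

# Detailed balance: the matrix of probability flows pi[x] * P[x, y] is symmetric.
flow = pi[:, None] * P
assert np.allclose(flow, flow.T)

# Detailed balance implies stationarity: pi P = pi.
assert np.allclose(pi @ P, pi)
```

The second assertion shows why detailed balance is so useful in practice: symmetry of the flow matrix is a local, pairwise condition that is easy to arrange by construction, yet it guarantees the global condition pi P = pi.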
If you knew which coin had been selected, this probability would be either 0.8 (coin A) or 0.2 (coin B) and would again be independent of the number of previous heads. However, the coin is unknown, and this induces dependence between the tosses.
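The induced dependence can be made explicit with a short calculation. This sketch assumes the two coins are a priori equally likely (the excerpt does not state the prior): we average the two head probabilities with respect to the posterior on the coin's identity, and the predictive probability of the next head then shifts with the observed data.

```python
def prob_next_head(n_heads, n_tails, p_a=0.8, p_b=0.2, prior_a=0.5):
    """P(next toss is a head | data), marginalising over the unknown coin.

    Coin A gives heads w.p. p_a, coin B w.p. p_b; prior_a is the assumed
    prior probability that coin A was selected (0.5 is an assumption here).
    """
    like_a = p_a ** n_heads * (1 - p_a) ** n_tails   # P(data | coin A)
    like_b = p_b ** n_heads * (1 - p_b) ** n_tails   # P(data | coin B)
    post_a = prior_a * like_a / (prior_a * like_a + (1 - prior_a) * like_b)
    return post_a * p_a + (1 - post_a) * p_b

# Given the coin, tosses are independent; marginally they are not:
print(prob_next_head(0, 0))  # 0.5 before any data
print(prob_next_head(3, 0))  # roughly 0.79: three heads favour coin A
```

Each additional head raises the posterior probability of coin A and hence the predictive probability of another head, which is exactly the dependence the slide describes.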
How do we sample from the posterior?

In general, we want samples from the joint posterior distribution pi(theta | x).

Independent sampling from pi(theta | x) may be difficult,

BUT

sampling from a Markov chain with pi(theta | x) as its stationary distribution is often easy.
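One way such a chain can be built (a sketch, not necessarily the construction the notes develop next) is random-walk Metropolis: it needs the target only up to a normalising constant, which is exactly the situation with a posterior known only as prior times likelihood. Here it targets an unnormalised standard normal for illustration.

```python
import math
import random

def metropolis_sample(log_target, n_iter=5000, step=1.0, x0=0.0, seed=1):
    """Random-walk Metropolis: simulates a Markov chain whose stationary
    distribution is proportional to exp(log_target)."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_iter):
        y = x + rng.gauss(0.0, step)  # symmetric proposal
        # Accept with probability min(1, target(y) / target(x)),
        # done on the log scale for numerical stability:
        if math.log(rng.random()) < log_target(y) - log_target(x):
            x = y
        samples.append(x)
    return samples

# Unnormalised log-density of N(0, 1); the constant -log(sqrt(2*pi)) is
# irrelevant, since only ratios of the target enter the acceptance step.
draws = metropolis_sample(lambda t: -0.5 * t * t)
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

Note that the normalising constant of the posterior, usually the hard part, never appears: only the ratio of target values at the current and proposed points is needed.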
Gibbs sampling, continued

Starting from an initial state theta^(0) = (theta_1^(0), theta_2^(0)):

Sample theta_1^(1) from p(theta_1 | theta_2^(0), x), then sample theta_2^(1) from p(theta_2 | theta_1^(1), x), then repeat.

The sequence (theta^(t)) forms a Markov chain with stationary distribution pi(theta | x).

For large t, theta^(t) has distribution close to pi(theta | x), and more to the point, averages over the sampled values converge to the corresponding posterior expectations.
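The alternating scheme above can be sketched on a toy target (this example is not from the notes): a bivariate normal with correlation rho, for which each full conditional is itself normal, N(rho * other, 1 - rho^2), so both Gibbs updates are exact draws.

```python
import random

def gibbs_bivariate_normal(rho=0.8, n_iter=5000, seed=1):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Alternates the two full-conditional draws exactly as on the slide:
    theta_1 | theta_2 ~ N(rho * theta_2, 1 - rho^2), and symmetrically.
    """
    rng = random.Random(seed)
    sd = (1.0 - rho ** 2) ** 0.5
    t1, t2 = 0.0, 0.0                   # theta^(0), an arbitrary start
    samples = []
    for _ in range(n_iter):
        t1 = rng.gauss(rho * t2, sd)    # theta_1^(t) ~ p(theta_1 | theta_2)
        t2 = rng.gauss(rho * t1, sd)    # theta_2^(t) ~ p(theta_2 | theta_1)
        samples.append((t1, t2))
    return samples

draws = gibbs_bivariate_normal()
```

Averages over `draws` approximate posterior expectations: the sample means approach 0 and the sample correlation approaches rho, illustrating the convergence statement on the slide.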