said about the interpretation of the marginal
likelihood, also called the integrated likelihood or evidence. See Ref.
We can interpret p(D|H) as the probability of our data if we accept our
hypothesized model structure H (namely π(θ, D|H)) and integrate out our
parameters:

    p(D|H) = ∫ p(D|θ, H) π(θ|H) dθ

I'll draw an example of p(D|H1) and p(D|H2) and discuss Occam's razor.

Nick Jones Inference, Control and Driving of Natural Systems

Model Comparison and Bayes Factors

Given two models which correspond to two different hypotheses
about the data, H1 and H2 with respective prior probabilities
p (H1 ) and p (H2 ), we can ask about the relative probabilities of the
data under these hypotheses. This is given by the Bayes Factor.
    B = p(D|H1) / p(D|H2)

This is a ratio of the marginal likelihoods, or model evidences. A
rule of thumb is that if B > 3 then there is substantial evidence
against hypothesis 2 relative to hypothesis 1.

Bayesian Protocol for Humans

1. Understand your problem and its context.
2. Formulate an appropriate probabilistic model which allows you
to write down p(D|θ, H), a likelihood for your data D. ...
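The quantities above can be made concrete with a small numerical sketch. This is not from the lecture: the coin-flip data, the choice of H1 (a fair coin with θ fixed at 0.5) and H2 (an unknown bias with a flat prior), and the grid size are all illustrative assumptions. It computes each marginal likelihood p(D|H) by integrating out θ, then forms the Bayes factor B.

```python
from math import comb

# Hypothetical data (illustrative only): k = 8 heads in n = 10 coin flips.
n, k = 10, 8

def likelihood(theta):
    """p(D | theta, H): binomial likelihood of the observed flips."""
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

# H1: the coin is fair -- theta is fixed at 0.5, so there is nothing
# to integrate and p(D|H1) is just the likelihood at theta = 0.5.
p_D_H1 = likelihood(0.5)

# H2: unknown bias theta with a flat prior pi(theta|H2) = 1 on [0, 1].
# p(D|H2) = integral of p(D|theta, H2) * pi(theta|H2) dtheta,
# approximated here with the trapezoidal rule on a uniform grid.
N = 10_000
h = 1.0 / N
grid = [likelihood(i * h) for i in range(N + 1)]
p_D_H2 = h * (sum(grid) - 0.5 * grid[0] - 0.5 * grid[-1])

# Bayes factor comparing H1 against H2.
B = p_D_H1 / p_D_H2
print(f"p(D|H1) = {p_D_H1:.4f}  p(D|H2) = {p_D_H2:.4f}  B = {B:.3f}")
```

For this data B comes out below 1 (exactly, p(D|H2) = 1/11, since the flat prior integrates the binomial likelihood to a Beta function), so the data mildly favour the flexible model H2. With less extreme data, say 5 heads in 10 flips, the same calculation gives B ≈ 2.7 in favour of the simpler fair-coin model: the flexible model spreads its prior mass over many values of θ and is penalized for it, which is Occam's razor at work in the marginal likelihood.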
This document was uploaded on 03/01/2014 for the course EE 208 at Imperial College (Spring '14).