said about the interpretation of the Marginal likelihood, or
Integrated likelihood, or Evidence (note, however, [3]). See Ref. [4].
We can interpret p(D | H) as the probability of our data if we accept
our hypothesized model structure H (namely π(θ, D | H)) and integrate
out our parameters:

p(D | H) = ∫ p(D | θ, H) π(θ | H) dθ

I'll draw an example of p(D | H1) and p(D | H2) and discuss Occam's
razor [2].

Nick Jones Inference, Control and Driving of Natural Systems

Model Comparison and Bayes Factors

Given two models which are equivalent to two different hypotheses
about the data, H1 and H2, with respective prior probabilities
p(H1) and p(H2), we can ask about the relative probabilities of the
data under these hypotheses. This is given by the Bayes Factor.
Bayes Factor
B = p(D | H1) / p(D | H2)

This is a ratio of the marginal likelihoods, or model evidences. A
common rule of thumb is that if B > 3 then there is substantial
evidence for H1 relative to H2.

Bayesian Protocol for Humans

1 Understand your problem and its context.
2 Formulate an appropriate probabilistic model which allows you to
write down p(D | θ, H), a likelihood for your data D ....
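The marginal-likelihood integral and the Bayes factor above can be sketched numerically. This is a minimal illustration, not from the notes: the data (8 heads in 10 coin flips) and the two models, H1 with a uniform prior on the coin bias θ and H2 a fair coin (all prior mass at θ = 0.5), are assumed for the example.

```python
import math

# Illustrative data (an assumption for this sketch): k = 8 heads in n = 10 flips.
n, k = 10, 8

def likelihood(theta):
    # p(D | theta, H): binomial likelihood of the observed data.
    return math.comb(n, k) * theta**k * (1 - theta)**(n - k)

# H1: uniform prior pi(theta | H1) = 1 on [0, 1], so
# p(D | H1) = integral of p(D | theta, H1) * pi(theta | H1) dtheta,
# approximated here with the trapezoidal rule on a grid.
N = 10_000
xs = [i / N for i in range(N + 1)]
ys = [likelihood(x) for x in xs]
p_D_H1 = sum((ys[i] + ys[i + 1]) / 2 for i in range(N)) / N

# H2: a fair coin puts all prior mass at theta = 0.5, so the
# integral collapses to the likelihood at that single point.
p_D_H2 = likelihood(0.5)

B = p_D_H1 / p_D_H2
print(f"p(D|H1) = {p_D_H1:.4f}  p(D|H2) = {p_D_H2:.4f}  B = {B:.2f}")
# Closed form for the uniform prior: p(D|H1) = 1/(n+1) = 1/11, so B is about 2.07,
# below the B > 3 rule of thumb: the evidence for H1 over H2 is not substantial.
```

Note how the flexible model H1 spreads its prior over all θ while H2 commits to one value; this trade-off between fit and prior spread is exactly the Occam's razor effect mentioned above.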
Spring '14, Nick Jones