Slide 1. CMPSCI 383 (Nov 1, 2011): Inference in Bayesian Networks

Slide 2. Today's topics: exact and approximate inference
• Exact
  • Inference with joint probability distributions
  • Exact inference in Bayesian networks
  • Inference by enumeration
  • Complexity of exact inference
• Approximate
  • Inference by stochastic simulation
  • Simple sampling
  • Rejection sampling
  • Markov chain Monte Carlo (MCMC)

Slide 3. Inference terminology
• Conditional probability table: a data structure that lists the probabilities of a variable given one or more other variables.
• Joint distribution: the distribution specified by a Bayesian network.
• Inference: produces the probability distribution of one or more variables given one or more other variables.

Slide 4. Example: Joint distribution (V = Cavity, T = Toothache, C = Catch)

Slide 5. Example: Home security

Slide 6. Compact conditional distributions
• Even conditional probability tables can be quite large.
• Combining functions, which relate the values of the parents to the value of the child, are one way of reducing their size.
• Example (for discrete variables): Noisy-OR with "inhibition probabilities".

Slide 7. Additional complexities: Mixed-mode nets
• We discussed how to handle discrete variables, but BNs can be used to represent and reason about a variety of variable types.

Slide 8. Compact conditional distributions
• For continuous variables, we can assume some linear functional dependence among the variables.
• For example, if Cost depends on Harvest and Subsidy, for each value of Subsidy...

Slide 9. Compact conditional distributions

Slide 10. Conditional independence
Node X is conditionally independent of its non-descendants given its parents.

Slide 11. Conditional independence
Node X is conditionally independent of all other nodes in the network given its "Markov blanket" (its parents, its children, and its children's other parents).

Slide 12. Conditional independence (revisited)
(Network fragment shown: Alarm with children JohnCalls and MaryCalls.)
• Are JohnCalls and MaryCalls independent?
• No, they are not completely independent.
• Whether they are independent is conditional on the value of Alarm.
• If the value of Alarm is known, are JohnCalls and MaryCalls independent?...

Illustrative code sketches for several of these topics follow below.
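A minimal sketch of "inference with joint probability distributions" (slide 2) using the Cavity/Toothache/Catch variables named on slide 4. The numeric entries are illustrative stand-ins (the slide's own table is not visible in this preview); the point is the mechanics of summing out hidden variables and normalizing.

```python
# Inference directly from a full joint distribution over three Boolean variables.
# joint[(cavity, toothache, catch)] -> probability (illustrative values only).
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

def query(var_index, evidence):
    """P(variable at var_index | evidence), where evidence maps index -> value."""
    dist = {}
    for assignment, p in joint.items():
        if all(assignment[i] == v for i, v in evidence.items()):
            dist[assignment[var_index]] = dist.get(assignment[var_index], 0.0) + p
    total = sum(dist.values())              # normalize after summing out hidden variables
    return {value: p / total for value, p in dist.items()}

# P(Cavity | Toothache = True): sums out Catch, then normalizes (here: 0.6 / 0.4).
print(query(0, {1: True}))
```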
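A small sketch of the Noisy-OR combining function mentioned on slide 6: each parent that is true has an independent "inhibition probability" of failing to cause the effect, and P(effect = false | parents) is the product of those inhibition probabilities over the true parents. The parent names and q values below are assumptions for illustration, not taken from the slide.

```python
from itertools import product

# Inhibition probability q_i for each parent (illustrative values).
inhibition = {"Cold": 0.6, "Flu": 0.2, "Malaria": 0.1}

def noisy_or_true(true_parents):
    """P(effect = True | the listed parents are true, all other parents false)."""
    p_false = 1.0
    for parent in true_parents:
        p_false *= inhibition[parent]      # each true parent can independently fail to cause the effect
    return 1.0 - p_false

# The full 2^3-row CPT is recovered from just three parameters:
for values in product([False, True], repeat=3):
    true_parents = [p for p, v in zip(inhibition, values) if v]
    print(true_parents, round(noisy_or_true(true_parents), 3))
```

This is why such combining functions shrink the table: the number of parameters grows linearly in the number of parents instead of exponentially.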
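A hedged sketch of the linear dependence slide 8 alludes to for Cost given Harvest (continuous) and Subsidy (discrete): for each value of Subsidy, Cost is normally distributed with a mean that is a linear function of Harvest (a linear Gaussian conditional). All parameter values here are made up for illustration.

```python
import math

params = {
    # subsidy value: (slope a, intercept b, standard deviation sigma) -- assumed numbers
    True:  (-0.5,  5.0, 0.6),
    False: (-1.0, 10.0, 1.0),
}

def cost_density(cost, harvest, subsidy):
    """p(Cost = cost | Harvest = harvest, Subsidy = subsidy) under the linear Gaussian model."""
    a, b, sigma = params[subsidy]
    mean = a * harvest + b                 # mean of Cost is linear in Harvest
    return math.exp(-0.5 * ((cost - mean) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

print(cost_density(cost=7.0, harvest=4.0, subsidy=False))   # density one sigma from the mean of 6.0
```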
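A minimal sketch of rejection sampling (slide 2) on the home-security ("burglary") network referenced on slides 5 and 12. The CPT numbers are the usual textbook values and are stated here as an assumption; the slide's own tables are not visible in this preview.

```python
import random

def prior_sample():
    """Sample one complete assignment by working through the network in topological order."""
    b = random.random() < 0.001
    e = random.random() < 0.002
    p_a = {(True, True): 0.95, (True, False): 0.94,
           (False, True): 0.29, (False, False): 0.001}[(b, e)]
    a = random.random() < p_a
    j = random.random() < (0.90 if a else 0.05)
    m = random.random() < (0.70 if a else 0.01)
    return {"Burglary": b, "Earthquake": e, "Alarm": a,
            "JohnCalls": j, "MaryCalls": m}

def rejection_sample(query_var, evidence, n=200_000):
    """Estimate P(query_var = True | evidence) by discarding samples that contradict the evidence."""
    counts = {True: 0, False: 0}
    for _ in range(n):
        s = prior_sample()
        if all(s[var] == val for var, val in evidence.items()):
            counts[s[query_var]] += 1
    total = counts[True] + counts[False]
    return counts[True] / total if total else float("nan")

# Estimate P(Burglary | JohnCalls = true, MaryCalls = true); the exact answer is about 0.284.
# Note how wasteful this is: almost every sample is rejected because the evidence is rare.
print(rejection_sample("Burglary", {"JohnCalls": True, "MaryCalls": True}))
```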
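A hedged sketch of MCMC inference via Gibbs sampling (slide 2) on the same burglary network: each non-evidence variable is repeatedly resampled from its distribution given the current values of its Markov blanket (slide 11). CPT values are again the usual textbook numbers, assumed rather than taken from the slides.

```python
import random

# network: var -> (parents, CPT mapping parent-value tuple -> P(var = True | parents))
network = {
    "Burglary":   ((), {(): 0.001}),
    "Earthquake": ((), {(): 0.002}),
    "Alarm":      (("Burglary", "Earthquake"),
                   {(True, True): 0.95, (True, False): 0.94,
                    (False, True): 0.29, (False, False): 0.001}),
    "JohnCalls":  (("Alarm",), {(True,): 0.90, (False,): 0.05}),
    "MaryCalls":  (("Alarm",), {(True,): 0.70, (False,): 0.01}),
}

def prob(var, value, state):
    """P(var = value | its parents' values in state)."""
    parents, cpt = network[var]
    p_true = cpt[tuple(state[p] for p in parents)]
    return p_true if value else 1.0 - p_true

def markov_blanket_prob(var, value, state):
    """Unnormalized P(var = value | Markov blanket): own CPT entry times each child's entry."""
    state = dict(state, **{var: value})
    p = prob(var, value, state)
    for child, (parents, _) in network.items():
        if var in parents:
            p *= prob(child, state[child], state)
    return p

def gibbs(query_var, evidence, n=50_000):
    hidden = [v for v in network if v not in evidence]
    state = dict(evidence, **{v: random.random() < 0.5 for v in hidden})
    count = 0
    for _ in range(n):                     # no burn-in here; fine for a rough sketch
        for v in hidden:
            p_true = markov_blanket_prob(v, True, state)
            p_false = markov_blanket_prob(v, False, state)
            state[v] = random.random() < p_true / (p_true + p_false)
        count += state[query_var]
    return count / n

# Estimate P(Burglary | JohnCalls = true, MaryCalls = true); the exact answer is about 0.284.
print(gibbs("Burglary", {"JohnCalls": True, "MaryCalls": True}))
```

Unlike rejection sampling, every sweep produces a usable sample, which is why MCMC copes better with unlikely evidence.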
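A small check of the closing question on slide 12: given Alarm, are JohnCalls and MaryCalls independent? The sketch enumerates the joint implied by the assumed textbook CPTs and compares P(JohnCalls | Alarm) with P(JohnCalls | Alarm, MaryCalls), and likewise without conditioning on Alarm.

```python
from itertools import product

def joint(b, e, a, j, m):
    """Joint probability of one full assignment, from the (assumed) CPTs."""
    p_a = {(True, True): 0.95, (True, False): 0.94,
           (False, True): 0.29, (False, False): 0.001}[(b, e)]
    def f(p, v):                           # probability of a Boolean value given P(value = True)
        return p if v else 1.0 - p
    return (f(0.001, b) * f(0.002, e) * f(p_a, a) *
            f(0.90 if a else 0.05, j) * f(0.70 if a else 0.01, m))

def cond(query, given):
    """P(query assignment | given assignment) by summing the joint; keys are variable indices."""
    num = den = 0.0
    for vals in product([False, True], repeat=5):
        p = joint(*vals)
        if all(vals[i] == v for i, v in given.items()):
            den += p
            if all(vals[i] == v for i, v in query.items()):
                num += p
    return num / den

# Indices: 0 = Burglary, 1 = Earthquake, 2 = Alarm, 3 = JohnCalls, 4 = MaryCalls
print(cond({3: True}, {2: True}))            # P(J | a)    = 0.90
print(cond({3: True}, {2: True, 4: True}))   # P(J | a, m) = 0.90 -> equal, so independent given Alarm
print(cond({3: True}, {}))                   # P(J)
print(cond({3: True}, {4: True}))            # P(J | m) differs from P(J) -> not independent unconditionally
```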
Course: CMPSCI 383, Fall 2011, UMass Amherst. Instructor: Andrew Barto.