
# Lecture 16 - CS 188: Artificial Intelligence, Spring 2010


CS 188: Artificial Intelligence, Spring 2010
Lecture 16: Bayes' Nets III - Inference
3/11/2010, Pieter Abbeel, UC Berkeley
Many slides over this course adapted from Dan Klein, Stuart Russell, and Andrew Moore.

## Announcements

- Current readings
  - Require login
- Assignments
  - W3 back today in lecture
  - W4 due tonight
- Midterm
  - 3/18, 6-9pm, 0010 Evans; no lecture on 3/18
  - We will be posting practice midterms
  - One-page note sheet, non-programmable calculators
  - Topics go through today, not next Tuesday

## Bayes' Net Semantics

Let's formalize the semantics of a Bayes' net:

- A set of nodes, one per variable X
- A directed, acyclic graph
- A conditional distribution for each node
  - A collection of distributions over X, one for each combination of the parents' values
  - CPT: conditional probability table
  - Description of a noisy "causal" process

(Figure: a node $X$ with parents $A_1, \dots, A_n$.)

A Bayes' net = topology (graph) + local conditional probabilities.

## Probabilities in BNs

For all joint distributions, we have (chain rule):

$$P(x_1, \dots, x_n) = \prod_{i=1}^{n} P(x_i \mid x_1, \dots, x_{i-1})$$

Bayes' nets implicitly encode joint distributions as a product of local conditional distributions. To see what probability a BN gives to a full assignment, multiply all the relevant conditionals together:

$$P(x_1, \dots, x_n) = \prod_{i=1}^{n} P(x_i \mid \mathrm{parents}(X_i))$$

This lets us reconstruct any entry of the full joint.
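As a minimal sketch of the idea above, consider a hypothetical two-node network Rain → Traffic (this example is not from the slides). Each node stores only its local CPT, and the probability of a full assignment is the product of each node's conditional given its parents' values:

```python
# Hypothetical CPTs for the network Rain -> Traffic (illustrative numbers).
# P(Rain)
P_rain = {True: 0.1, False: 0.9}
# P(Traffic | Rain), keyed by (rain, traffic)
P_traffic_given_rain = {
    (True, True): 0.8,
    (True, False): 0.2,
    (False, True): 0.1,
    (False, False): 0.9,
}

def joint(rain: bool, traffic: bool) -> float:
    """P(rain, traffic) = P(rain) * P(traffic | rain): multiply each node's
    local conditional distribution, given its parents' values."""
    return P_rain[rain] * P_traffic_given_rain[(rain, traffic)]

# Any entry of the full joint is reconstructed from the local tables,
# e.g. P(Rain=true, Traffic=true) = 0.1 * 0.8
print(joint(True, True))
```

Because every entry of the joint factors this way, the four `joint(...)` values sum to 1, just as a full joint table would.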