# SP11 cs188 lecture 19 -- naive bayes 6PP

## CS 188: Artificial Intelligence, Spring 2011

Lecture 19: Dynamic Bayes Nets, Naïve Bayes (4/6/2011). Pieter Abbeel – UC Berkeley. Slides adapted from Dan Klein.

## Announcements

- W4 out, due next week Monday
- P4 out, due next week Friday
- Mid-semester survey

## Announcements II

- Course contest: regular tournaments. Instructions have been posted!
- First week, extra credit for the top 20; next week the top 10, then the top 5, then the top 3.
- First nightly tournament: tentatively Monday night

## P4: Ghostbusters 2.0

- Plot: Pacman's grandfather, Grandpac, learned to hunt ghosts for sport.
- He was blinded by his power, but could hear the ghosts' banging and clanging.
- Transition model: all ghosts move randomly, but are sometimes biased.
- Emission model: Pacman knows a "noisy" distance to each ghost.

[Figure: emission model -- probability of each noisy distance reading when the true distance is 8]

## Today

- Dynamic Bayes Nets (DBNs) [sometimes called temporal Bayes nets]
- Demos: localization; Simultaneous Localization And Mapping (SLAM)
- Start machine learning

## Dynamic Bayes Nets (DBNs)

- We want to track multiple variables over time, using multiple sources of evidence.
- Idea: repeat a fixed Bayes net structure at each time step.
- Variables from time t can condition on those from t-1.
- Discrete-valued dynamic Bayes nets are also HMMs.

[Figure: DBN over t = 1, 2, 3 with ghost variables G_t^a, G_t^b and evidence variables E_t^a, E_t^b]
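The repeated-structure idea can be made concrete with a small sketch. This is a minimal Python illustration, not the project's actual code: the 1-D grid, the random-walk transition model, and the noisy-distance emission model are all assumptions chosen for simplicity. The point is only that the same local models are reused at every time step, with each variable at time t depending on its parent at t-1.

```python
import random

# Hypothetical 1-D grid world with positions 0..9 and two independent ghosts.
def sample_transition(pos):
    """Ghost moves -1, 0, or +1 uniformly at random (clamped to the grid)."""
    return min(max(pos + random.choice([-1, 0, 1]), 0), 9)

def sample_emission(pos, pacman=0):
    """Noisy distance reading: true distance plus uniform noise in {-1, 0, +1}."""
    return abs(pos - pacman) + random.choice([-1, 0, 1])

# "Unroll" the repeated structure for t = 1..3: each ghost position at time t
# depends only on that ghost's position at t-1; each reading depends only on
# the current position.
ghost_a, ghost_b = 3, 7          # assumed initial positions
for t in range(1, 4):
    ghost_a = sample_transition(ghost_a)
    ghost_b = sample_transition(ghost_b)
    print(t, ghost_a, ghost_b, sample_emission(ghost_a), sample_emission(ghost_b))
```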

## Exact Inference in DBNs

- Variable elimination applies to dynamic Bayes nets.
- Procedure: "unroll" the network for T time steps, then eliminate variables until P(X_T | e_{1:T}) is computed.
- Online belief updates: eliminate all variables from the previous time step; store factors for the current time step only.

[Figure: unrolled DBN over t = 1, 2, 3 with ghost variables G_t^a, G_t^b and evidence variables E_t^a, E_t^b]

## DBN Particle Filters

- A particle is a complete sample for a time step.
- Initialize: generate prior samples for the t=1 Bayes net.
  - Example particle: G_1^a = (3,3), G_1^b = (5,3)
- Elapse time: sample a successor for each particle.
  - Example successor: G_2^a = (2,3), G_2^b = (6,3)
- Observe: weight each entire sample by the likelihood of the evidence conditioned on the sample.
  - Likelihood: P(E_1^a | G_1^a) * P(E_1^b | G_1^b)
- Resample: select prior samples (tuples of values) in proportion to their likelihood.

[Demo]

## Trick I to Improve Particle Filtering Performance: Low-Variance Resampling

- Advantages:
  - More systematic coverage of the space of samples
  - If all samples have the same importance weight, no samples are lost
  - Lower computational complexity
  - If there is no or little noise in the transition model, all ... [slide truncated in preview]
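The online belief update described above -- eliminate the previous time step, keep only factors over the current one -- reduces, for a single chain, to the familiar HMM forward update. Here is a hedged sketch in Python; the two-state model and all the numbers are illustrative assumptions, not anything from the project.

```python
# Online belief update for a single-variable chain (the HMM special case of
# a DBN): marginalize out time t-1, keep only a factor over time t.
def elapse_time(belief, transition):
    """P(X_t | e_{1:t-1}) = sum over x of P(X_t | x) * P(x | e_{1:t-1})."""
    new = {x2: 0.0 for x2 in belief}
    for x1, p in belief.items():
        for x2, pt in transition[x1].items():
            new[x2] += p * pt
    return new

def observe(belief, likelihood):
    """Weight by the evidence likelihood P(e_t | X_t), then renormalize."""
    new = {x: p * likelihood[x] for x, p in belief.items()}
    z = sum(new.values())
    return {x: p / z for x, p in new.items()}

# Toy two-state example (states 'L', 'R'); the probabilities are made up.
T = {'L': {'L': 0.7, 'R': 0.3}, 'R': {'L': 0.3, 'R': 0.7}}
b = {'L': 0.5, 'R': 0.5}
b = elapse_time(b, T)                  # symmetric model: still uniform
b = observe(b, {'L': 0.9, 'R': 0.2})   # evidence favors 'L'
print(b)                               # belief concentrates on 'L'
```

Storing only the current factor is what keeps the update constant-size per step instead of growing with T.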
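The elapse/observe/resample cycle for the two-ghost case can be sketched as follows. This is an illustrative assumption-laden sketch, not the project's code: the 1-D positions, the `move` random walk, and the `dist_prob` sensor model are all invented here; only the three-step structure (sample successors, weight each entire particle by P(E^a | G^a) * P(E^b | G^b), resample by weight) follows the slide.

```python
import random

def particle_filter_step(particles, transition_sample, emission_prob, evidence):
    """One elapse-time / observe / resample cycle for a two-ghost DBN.
    A particle is a complete sample (g_a, g_b) for one time step."""
    # Elapse time: sample a successor position for each ghost in each particle.
    particles = [(transition_sample(ga), transition_sample(gb))
                 for ga, gb in particles]
    # Observe: weight each entire sample by P(E^a | G^a) * P(E^b | G^b).
    ea, eb = evidence
    weights = [emission_prob(ea, ga) * emission_prob(eb, gb)
               for ga, gb in particles]
    # Resample: select particles in proportion to their weight.
    return random.choices(particles, weights=weights, k=len(particles))

# Toy 1-D models (assumptions for illustration only):
def move(g):                      # random walk on positions 0..9
    return min(max(g + random.choice([-1, 0, 1]), 0), 9)

def dist_prob(e, g):              # sensor usually reads the true position
    return {0: 0.6, 1: 0.15}.get(abs(e - g), 0.01)

ps = [(random.randrange(10), random.randrange(10)) for _ in range(300)]
for _ in range(10):               # repeated evidence: ghosts near 3 and 7
    ps = particle_filter_step(ps, move, dist_prob, (3, 7))
```

After a few cycles the particle population concentrates around positions consistent with the evidence.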
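Low-variance (systematic) resampling, the trick named in the last slide, can be sketched like this: draw a single random offset, then sweep evenly spaced pointers across the cumulative weights. The implementation below is a common textbook version, offered as an assumption about what the slide intends rather than the course's own code. It makes the advantage on the slide concrete: when all weights are equal, every particle is kept exactly once, so no samples are lost.

```python
import random

def low_variance_resample(particles, weights):
    """Systematic resampling with one random draw and n evenly spaced
    pointers. If all weights are equal, the particle set is kept exactly."""
    n = len(particles)
    step = sum(weights) / n
    r = random.uniform(0, step)        # the single random draw
    out, cum, i = [], weights[0], 0
    for m in range(n):
        u = r + m * step               # m-th evenly spaced pointer
        while u > cum:                 # advance to the particle whose
            i += 1                     # cumulative weight covers u
            cum += weights[i]
        out.append(particles[i])
    return out
```

Compare this with drawing n independent samples: the systematic sweep touches each particle at most once per unit of weight, which is both lower-variance and O(n) instead of O(n log n).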

## This note was uploaded on 08/26/2011 for the course CS 188 taught by Professor Staff during the Spring '08 term at University of California, Berkeley.
