Lecture 21 - CS 188: Artificial Intelligence, Spring 2010

CS 188: Artificial Intelligence, Spring 2010
Lecture 21: Speech Recognition
4/8/2010
Pieter Abbeel - UC Berkeley

Announcements
- Written 6 due tonight
- Project 4 is up! Due 4/15 - start early!
- Course contest update: planning to post by Friday night
P4: Ghostbusters 2.0
- Plot: Pacman's grandfather, Grandpac, learned to hunt ghosts for sport.
- He was blinded by his power, but could hear the ghosts' banging and clanging.
- Transition model: all ghosts move randomly, but are sometimes biased.
- Emission model: Pacman knows a "noisy" distance to each ghost.

[Figure: probability of each noisy distance reading (1-15), peaked near the true distance of 8]

Today
- Dynamic Bayes nets (DBNs) [sometimes called temporal Bayes nets]
- HMMs: most-likely-explanation queries
- Speech recognition: a massive HMM! Details of this section are not required.
- Start machine learning
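The emission model above can be sketched in code. The falloff shape below is an assumption for illustration (the actual project ships its own fixed noise table); only the idea that readings are distributed around the true distance comes from the slide.

```python
def noisy_distance_dist(true_dist, max_noise=7):
    """Hypothetical emission model P(reading | true distance): a discrete
    distribution over readings, peaked at the true distance and falling
    off linearly with the error. Not the project's actual noise table."""
    readings = range(max(1, true_dist - max_noise), true_dist + max_noise + 1)
    weights = {r: max_noise + 1 - abs(r - true_dist) for r in readings}
    total = sum(weights.values())
    return {r: w / total for r, w in weights.items()}

# With true distance 8, readings 1..15 are possible, as in the figure.
dist = noisy_distance_dist(8)
```

With `max_noise=7` the support matches the 1-15 axis of the histogram on the slide, and the mode sits at the true distance of 8.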
Dynamic Bayes Nets (DBNs)
- We want to track multiple variables over time, using multiple sources of evidence.
- Idea: repeat a fixed Bayes net structure at each time step.
- Variables from time t can condition on those from t-1.
- Discrete-valued dynamic Bayes nets are also HMMs.

[Figure: a DBN unrolled for t = 1, 2, 3, with ghost variables G_t^a, G_t^b and evidence variables E_t^a, E_t^b at each time step]

Exact Inference in DBNs
- Variable elimination applies to dynamic Bayes nets.
- Procedure: "unroll" the network for T time steps, then eliminate variables until P(X_T | e_1:T) is computed.
- Online belief updates: eliminate all variables from the previous time step; store factors for the current time step only.

[Figure: the same unrolled DBN, showing the factors kept at each time step]
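The online belief update can be sketched for the special case of an HMM (a discrete DBN with a single state variable): elapse time through the transition model, weight by the evidence likelihood, normalize. The two-state numbers below are made up for illustration.

```python
def forward_update(belief, transition, evidence_likelihood):
    """One online belief update, P(X_{t-1}|e_{1:t-1}) -> P(X_t|e_{1:t}).
    belief[i]              = P(X_{t-1}=i | e_{1:t-1})
    transition[i][j]       = P(X_t=j | X_{t-1}=i)
    evidence_likelihood[j] = P(e_t | X_t=j)
    """
    n = len(belief)
    # Elapse time: predicted[j] = sum_i belief[i] * P(X_t=j | X_{t-1}=i)
    predicted = [sum(belief[i] * transition[i][j] for i in range(n))
                 for j in range(n)]
    # Observe: weight each state by the evidence likelihood, then normalize.
    weighted = [p * l for p, l in zip(predicted, evidence_likelihood)]
    z = sum(weighted)
    return [w / z for w in weighted]

# Illustrative two-state chain (values are assumptions, not course data).
belief = [0.5, 0.5]
transition = [[0.9, 0.1],
              [0.2, 0.8]]
evidence_likelihood = [0.9, 0.2]   # evidence favors state 0
new_belief = forward_update(belief, transition, evidence_likelihood)
```

Storing only `new_belief` after each step is exactly the "store factors for the current time only" idea: the unrolled past has been eliminated into a single factor.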
DBN Particle Filters
- A particle is a complete sample for a time step.
- Initialize: generate prior samples for the t = 1 Bayes net.
  Example particle: G_1^a = (3,3), G_1^b = (5,3)
- Elapse time: sample a successor for each particle.
  Example successor:
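The initialize and elapse-time steps above can be sketched as follows. The grid size and the ghost transition model here are assumptions for illustration, not the project's actual dynamics.

```python
import random

random.seed(0)  # deterministic for the example

GRID = [(x, y) for x in range(1, 6) for y in range(1, 6)]

def initialize(n):
    """Each particle is a complete sample for one time step:
    a joint assignment (G^a, G^b) of both ghost positions."""
    return [(random.choice(GRID), random.choice(GRID)) for _ in range(n)]

def step_ghost(pos):
    """Toy transition model (an assumption, not the project's):
    stay put or move one square in a random legal direction."""
    x, y = pos
    moves = [(x + dx, y + dy)
             for dx, dy in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1))
             if (x + dx, y + dy) in GRID]
    return random.choice(moves)

def elapse_time(particles):
    """Sample a successor for each particle, ghost by ghost."""
    return [(step_ghost(ga), step_ghost(gb)) for ga, gb in particles]

particles = elapse_time(initialize(200))
```

Note that each particle samples *all* variables of the time step jointly, which is what lets particle filtering scale to DBNs whose joint state space is too large for exact belief updates.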
