Lecture 21 - CS 188: Artificial Intelligence, Spring 2010

CS 188: Artificial Intelligence, Spring 2010
Lecture 21: Speech Recognition
4/8/2010
Pieter Abbeel – UC Berkeley

Announcements
- Written 6 is due tonight
- Project 4 is up!
  - Due 4/15 – start early!
- Course contest update
  - Planning to post by Friday night
P4: Ghostbusters 2.0
- Plot: Pacman's grandfather, Grandpac, learned to hunt ghosts for sport.
- He was blinded by his power, but could hear the ghosts' banging and clanging.
- Transition model: all ghosts move randomly, but are sometimes biased.
- Emission model: Pacman knows a "noisy" distance to each ghost (see the sketch after this slide).
[Figure: P(noisy distance | true distance = 8) – a distribution over noisy readings from 1 to 15, peaked at the true distance of 8.]

Today
- Dynamic Bayes Nets (DBNs) [sometimes called temporal Bayes nets]
- HMMs: most likely explanation queries
- Speech recognition
  - A massive HMM!
  - Details of this section are not required
- Start machine learning
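As a rough illustration of the emission model above (not the project's actual API), a noisy distance sensor can be modeled as a discrete distribution over readings centered on the true distance. The geometric decay rate and reading range below are assumptions, chosen only to mirror the peaked shape in the figure.

```python
# A minimal sketch of a noisy-distance emission model, assuming readings are
# distributed around the true distance with geometrically decaying probability.
# The decay rate and reading range are illustrative, not the project's values.

def emission_distribution(true_distance, max_reading=15, decay=0.5):
    """Return P(noisy reading | true distance) as a dict, peaked at the true distance."""
    weights = {r: decay ** abs(r - true_distance) for r in range(1, max_reading + 1)}
    total = sum(weights.values())
    return {r: w / total for r, w in weights.items()}

# Example: distribution over noisy readings when the ghost is truly 8 away.
dist = emission_distribution(true_distance=8)
print(max(dist, key=dist.get))  # the most likely reading is 8
```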
Dynamic Bayes Nets (DBNs)
- We want to track multiple variables over time, using multiple sources of evidence.
- Idea: repeat a fixed Bayes net structure at each time step.
- Variables from time t can condition on those from time t-1.
- Discrete-valued dynamic Bayes nets are also HMMs.
[Figure: a DBN unrolled for t = 1, 2, 3, with ghost variables G_t^a, G_t^b and evidence variables E_t^a, E_t^b at each time step.]

Exact Inference in DBNs
- Variable elimination applies to dynamic Bayes nets.
- Procedure: "unroll" the network for T time steps, then eliminate variables until P(X_T | e_{1:T}) is computed.
- Online belief updates: eliminate all variables from the previous time step and store factors for the current time only (a minimal sketch follows this slide).
[Figure: the same DBN unrolled for t = 1, 2, 3.]
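For a DBN with a few discrete state variables, the online update described above can be carried out by joining the hidden variables into one joint state and running the standard elapse-time / observe recursion, keeping only the current time step's factors. The sketch below assumes hypothetical `transition_prob` and `emission_prob` functions supplied by the model; they are placeholders, not part of any course codebase.

```python
# A minimal sketch of online belief updating, assuming the DBN's hidden
# variables at each time step have been joined into a single discrete state,
# e.g. state = (ghost_a_pos, ghost_b_pos). transition_prob(prev, s) and
# emission_prob(s, evidence) are assumed model-specific callables.

def online_update(belief, evidence, states, transition_prob, emission_prob):
    """One step of exact filtering: elapse time, weight by the evidence, normalize."""
    # Elapse time: sum out the previous time step's state.
    predicted = {s: sum(transition_prob(prev, s) * p for prev, p in belief.items())
                 for s in states}
    # Observe: multiply in the evidence likelihood.
    updated = {s: emission_prob(s, evidence) * p for s, p in predicted.items()}
    total = sum(updated.values())
    return {s: p / total for s, p in updated.items()}
```

Only the belief over the current time step is returned; the previous belief is discarded after each call, which is the "store factors for the current time only" idea from the slide.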
DBN Particle Filters
- A particle is a complete sample for a time step.
- Initialize: generate prior samples for the t = 1 Bayes net.
  - Example particle: G_1^a = (3,3), G_1^b = (5,3)
- Elapse time: sample a successor for each particle.
  - Example successor: G_2^a = (2,3), G_2^b = (6,3)
- Observe (see the sketch below)
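A minimal sketch of the initialize / elapse-time / observe cycle for a DBN particle filter over the two ghost variables. The `sample_prior`, `sample_successor`, and `evidence_likelihood` functions are hypothetical stand-ins for the model-specific pieces, and the weight-and-resample observe step shown is the standard particle-filter treatment, not code from the project.

```python
import random

# A minimal sketch of a DBN particle filter. Each particle is a complete
# assignment to the hidden variables at one time step, e.g. a tuple of the
# two ghost positions ((3, 3), (5, 3)). sample_prior, sample_successor, and
# evidence_likelihood are assumed model-specific functions.

def initialize(num_particles, sample_prior):
    # Generate prior samples for the t = 1 Bayes net.
    return [sample_prior() for _ in range(num_particles)]

def elapse_time(particles, sample_successor):
    # Sample a successor for each particle from the transition model.
    return [sample_successor(p) for p in particles]

def observe(particles, evidence, evidence_likelihood):
    # Weight each whole particle by the likelihood of the evidence, then resample.
    weights = [evidence_likelihood(p, evidence) for p in particles]
    if sum(weights) == 0:
        # All particles are inconsistent with the evidence; here we simply
        # keep the old particles rather than reinitializing.
        return particles
    return random.choices(particles, weights=weights, k=len(particles))
```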