Prof. Jeff Bilmes, EE596A/Winter 2013/DGMs, Lecture 5, Jan 25th, 2013

HMM (Viterbi) Decoding

…recursion. Since

\[
\max_{q_{1:T} \in \mathsf{D}_{Q_{1:T}}} p(\bar{x}_{1:T}, q_{1:T})
  = \max_{q_{1:T} \in \mathsf{D}_{Q_{1:T}}} \prod_t p(\bar{x}_t \mid q_t)\, p(q_t \mid q_{t-1})
  \tag{5.12}
\]
\[
  = \max_{q_T} p(\bar{x}_T \mid q_T) \cdots \max_{q_2} p(\bar{x}_2 \mid q_2)\, p(q_3 \mid q_2)
    \max_{q_1} p(\bar{x}_1 \mid q_1)\, p(q_2 \mid q_1)
  \tag{5.13}
\]
\[
  = \max_{q_T} p(\bar{x}_T \mid q_T) \cdots p(\bar{x}_3 \mid q_3)
    \max_{q_2} p(q_3 \mid q_2) \Bigl( p(\bar{x}_2 \mid q_2)
    \max_{q_1} p(q_2 \mid q_1)\, \bigl( p(\bar{x}_1 \mid q_1) \bigr) \Bigr)
  \tag{5.14}
\]

the max operator distributes over the factors of the product just as summation does, so the joint maximization can be computed as a sequence of nested, per-timestep maximizations.

Why is it called (Viterbi) decoding?

Source-channel model of communications (from Information Theory):

[Figure: block diagram of the source-channel model: source W → source coder → channel encoder → noisy channel p(x|y) → channel decoder → source decoder → receiver Ŵ]

Consider the source being generated by a Markov chain, and the "channel" being each symbol corrupted by some channel noise (the observation distribution).
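The identity in (5.12)-(5.14) can be checked numerically on a toy example: brute-force maximization over all state sequences must agree with the nested right-to-left maximization. The parameters below (2 states, 3 time steps, and an explicit initial distribution `pi` for p(q_1), which the product in (5.12) leaves implicit) are illustrative, not from the lecture.

```python
import numpy as np
from itertools import product

# Hypothetical toy HMM: S states, T time steps, fixed observations x̄_{1:T}.
rng = np.random.default_rng(0)
T, S = 3, 2
A = rng.random((S, S))
A /= A.sum(axis=1, keepdims=True)   # A[r, q] = p(Q_t = q | Q_{t-1} = r)
pi = np.array([0.6, 0.4])           # p(Q_1 = q) (assumed initial distribution)
b = rng.random((T, S))              # b[t, q] = p(x̄_t | Q_t = q)

def joint(q):
    """p(x̄_{1:T}, q_{1:T}) for one state sequence q, as in (5.12)."""
    p = pi[q[0]] * b[0, q[0]]
    for t in range(1, T):
        p *= A[q[t - 1], q[t]] * b[t, q[t]]
    return p

# Brute force: max over all S**T state sequences.
brute = max(joint(q) for q in product(range(S), repeat=T))

# Nested maxes, innermost over q_1 first, as in (5.13)-(5.14).
m = pi * b[0]                                  # p(q_1) p(x̄_1 | q_1)
for t in range(1, T):
    # max_{q_{t-1}} p(q_t | q_{t-1}) m(q_{t-1}), then multiply in p(x̄_t | q_t)
    m = b[t] * (A * m[:, None]).max(axis=0)
best = m.max()

assert abs(brute - best) < 1e-12   # distributing max over the product is exact
```

The nested version touches each (q_{t-1}, q_t) pair once per step, so it costs O(T·S²) instead of the O(S^T) of the brute-force maximization.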
[Figure: a Markov-chain source y_1, y_2, … passed through the noisy channel p(x|y), yielding the corrupted observations x_1, x_2, … seen at the receiver]

Most Probable Explanation

We can thus define a modified form of the α-recursion that, rather than summation, uses a max operator:

\[
\alpha^m_q(t) = p(\bar{x}_t \mid Q_t = q) \max_r p(Q_t = q \mid Q_{t-1} = r)\, \alpha^m_r(t-1)
\tag{5.15}
\]
\[
\alpha^m_q(1) = p(\bar{x}_1 \mid Q_1 = q)\, p(Q_1 = q)
\tag{5.16}
\]
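The max-α recursion above can be sketched in code. A minimal implementation, with back-pointers added so the maximizing state sequence itself can be recovered (the back-pointer array `psi` and all numerical parameters below are illustrative additions, not from the lecture):

```python
import numpy as np

def viterbi(pi, A, B):
    """Max-α (Viterbi) recursion with back-pointers.

    pi[q]   = p(Q_1 = q)
    A[r, q] = p(Q_t = q | Q_{t-1} = r)
    B[t, q] = p(x̄_t | Q_t = q) for the fixed observation sequence
    Returns (max joint probability, most probable state sequence).
    """
    T, S = B.shape
    alpha = np.zeros((T, S))             # alpha[t, q] = α^m_q(t+1)
    psi = np.zeros((T, S), dtype=int)    # best predecessor state
    alpha[0] = pi * B[0]                 # base case, α^m_q(1)
    for t in range(1, T):
        scores = A * alpha[t - 1][:, None]   # [r, q]: p(q|r) α^m_r(t-1)
        psi[t] = scores.argmax(axis=0)       # argmax_r, per state q
        alpha[t] = B[t] * scores.max(axis=0) # max-α recursion
    # Backtrack from the best final state.
    path = [int(alpha[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return alpha[-1].max(), path[::-1]

# Hypothetical 2-state, 3-observation example.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.2],
              [0.1, 0.8],
              [0.9, 0.2]])
prob, path = viterbi(pi, A, B)
```

Computing `alpha` alone gives only the value of the maximum, max_{q_{1:T}} p(x̄_{1:T}, q_{1:T}); the back-pointers are what turn the recursion into a decoder that outputs the most probable explanation q*_{1:T}.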

