$\alpha_{t-1}(i)$  the previous forward path probability from the previous time step

$a_{ij}$  the transition probability from previous state $q_i$ to current state $q_j$

$b_j(o_t)$  the state observation likelihood of the observation symbol $o_t$ given the current state $j$
Figure 9.7  The forward trellis for computing the total observation likelihood for the ice-cream events 3 1 3. Hidden states are in circles, observations in squares. White (unfilled) circles indicate illegal transitions. The figure shows the computation of $\alpha_t(j)$ for two states at two time steps. The computation in each cell follows Eq. 9.14: $\alpha_t(j) = \sum_{i=1}^{N} \alpha_{t-1}(i)\, a_{ij}\, b_j(o_t)$. The resulting probability expressed in each cell is Eq. 9.13: $\alpha_t(j) = P(o_1, o_2 \ldots o_t, q_t = j \mid \lambda)$. [Figure not reproduced; its key cell values are $\alpha_1(1) = .02$, $\alpha_1(2) = .32$, $\alpha_2(1) = .32 \times .15 + .02 \times .25 = .053$, and $\alpha_2(2) = .32 \times .12 + .02 \times .08 = .040$.]

Consider the computation in Fig. 9.7 of $\alpha_2(2)$, the forward probability of being at time step 2 in state 2 having generated the partial observation 3 1. We compute it by extending the $\alpha$ probabilities from time step 1, via two paths, each extension consisting of the three factors above: $\alpha_1(1) \times P(H|C) \times P(1|H)$ and $\alpha_1(2) \times P(H|H) \times P(1|H)$. Figure 9.8 shows another visualization of this induction step for computing the value in one new cell of the trellis.

We give two formal definitions of the forward algorithm: the pseudocode in Fig. 9.9 and a statement of the definitional recursion here.

1. Initialization:

   $\alpha_1(j) = a_{0j}\, b_j(o_1) \qquad 1 \le j \le N$    (9.15)

2. Recursion (since states 0 and F are non-emitting):

   $\alpha_t(j) = \sum_{i=1}^{N} \alpha_{t-1}(i)\, a_{ij}\, b_j(o_t) \qquad 1 \le j \le N,\; 1 < t \le T$    (9.16)

3. Termination:

   $P(O \mid \lambda) = \alpha_T(q_F) = \sum_{i=1}^{N} \alpha_T(i)\, a_{iF}$    (9.17)
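To make the induction step concrete, the following is a minimal Python sketch (ours, not the textbook's) that reproduces the Fig. 9.7 trellis values for the first two time steps; the transition and emission probabilities are the ones shown in the figure.

    # Minimal sketch (ours, not the textbook's): reproduce the Fig. 9.7 trellis
    # values for the first two ice-cream observations, 3 then 1.
    # States: 'C' (cold) and 'H' (hot); probabilities are read off the figure.
    a = {('start', 'C'): 0.2, ('start', 'H'): 0.8,   # a_{0j}: transitions out of start
         ('C', 'C'): 0.5, ('C', 'H'): 0.4,
         ('H', 'C'): 0.3, ('H', 'H'): 0.6}
    b = {('C', 1): 0.5, ('C', 3): 0.1,               # b_j(o_t): emission likelihoods
         ('H', 1): 0.2, ('H', 3): 0.4}

    # Initialization (Eq. 9.15): alpha_1(j) = a_{0j} * b_j(o_1), with o_1 = 3
    alpha1 = {j: a[('start', j)] * b[(j, 3)] for j in ('C', 'H')}
    print(alpha1)   # {'C': 0.02, 'H': 0.32}  -- alpha_1(1) and alpha_1(2)

    # Recursion (Eq. 9.16) for t = 2 with o_2 = 1: sum over both incoming paths
    alpha2 = {j: sum(alpha1[i] * a[(i, j)] for i in ('C', 'H')) * b[(j, 1)]
              for j in ('C', 'H')}
    print(alpha2)   # {'C': ~0.053, 'H': ~0.04}  -- alpha_2(1) and alpha_2(2)

The two terms summed for alpha2['H'] are exactly the two path extensions discussed above, $\alpha_1(1) \times P(H|C) \times P(1|H)$ and $\alpha_1(2) \times P(H|H) \times P(1|H)$.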
Figure 9.8  Visualizing the computation of a single element $\alpha_t(i)$ in the trellis by summing all the previous values $\alpha_{t-1}$, weighted by their transition probabilities $a$, and multiplying by the observation probability $b_i(o_{t+1})$. For many applications of HMMs, many of the transition probabilities are 0, so not all previous states will contribute to the forward probability of the current state. Hidden states are in circles, observations in squares. Shaded nodes are included in the probability computation for $\alpha_t(i)$. Start and end states are not shown. [Figure not reproduced.]

function FORWARD(observations of len T, state-graph of len N) returns forward-prob

  create a probability matrix forward[N+2, T]
  for each state s from 1 to N do                               ; initialization step
      forward[s, 1] ← a_{0,s} · b_s(o_1)
  for each time step t from 2 to T do                           ; recursion step
      for each state s from 1 to N do
          forward[s, t] ← Σ_{s'=1}^{N} forward[s', t-1] · a_{s',s} · b_s(o_t)
  forward[q_F, T] ← Σ_{s=1}^{N} forward[s, T] · a_{s,q_F}       ; termination step
  return forward[q_F, T]

Figure 9.9  The forward algorithm. We've used the notation forward[s, t] to represent $\alpha_t(s)$.
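The pseudocode in Fig. 9.9 translates almost line for line into executable code. Below is a minimal Python sketch (ours, not the textbook's) run on the chapter's ice-cream HMM; the end-transition probabilities $a_{iF}$ and the emission probability for two ice creams are not given in this excerpt, so the values used for them here are illustrative assumptions (neither changes the trellis values shown in Fig. 9.7).

    import numpy as np

    def forward(obs, trans, emit, init, final):
        """Forward algorithm in the style of Fig. 9.9; fwd[s, t] holds alpha_t(s)."""
        N, T = trans.shape[0], len(obs)
        fwd = np.zeros((N, T))
        fwd[:, 0] = init * emit[:, obs[0]]                    # initialization step (Eq. 9.15)
        for t in range(1, T):                                 # recursion step (Eq. 9.16)
            for j in range(N):
                fwd[j, t] = (fwd[:, t - 1] @ trans[:, j]) * emit[j, obs[t]]
        return fwd[:, -1] @ final                             # termination step (Eq. 9.17)

    # Ice-cream HMM: state 0 = C (cold), state 1 = H (hot); observations are counts 1-3.
    trans = np.array([[0.5, 0.4],      # a_{CC}, a_{CH}
                      [0.3, 0.6]])     # a_{HC}, a_{HH}
    emit  = np.array([[0.0, 0.5, 0.4, 0.1],    # b_C(o), indexed by count (column 0 unused)
                      [0.0, 0.2, 0.4, 0.4]])   # b_H(o); the count-2 column is an assumption
    init  = np.array([0.2, 0.8])       # a_{0C}, a_{0H}
    final = np.array([1.0, 1.0])       # a_{iF}: assumed to be 1 here, for illustration only
    print(forward([3, 1, 3], trans, emit, init, final))   # P(3 1 3 | lambda)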