HMMs: Given an observation sequence $r_1^K$ generated by an HMM, we are usually interested in determining the underlying Markov chain $S_1^K$ embedded in this data. In Example 3 above, the HMM represents the received data in a digital communication system, and the Markov chain represents the transmitted symbol information. So, given data $r_1^K$, we would like to determine the state sequence $S_1^K$ which in some sense best fits this data.

The MLSE problem is
$$
\max_{\{S_1, S_2, \cdots, S_K\}} \; \prod_{n=1}^{K} p(r_n / S_n)
\tag{14}
$$
and the MAP problem is
$$
\max_{\{S_1, S_2, \cdots, S_K\}} \; \left[ \prod_{n=1}^{K} p(r_n / S_n)\, P(S_n / S_{n-1}) \right] \cdot P(S_0) .
\tag{15}
$$
(Maximizing the posterior $P(S_1^K / r_1^K)$ is equivalent to maximizing the joint probability in (15), since $p(r_1^K)$ does not depend on the state sequence.) Note that for this MLSE problem statement $S_n$ does not necessarily have to be a Markov chain, but for MAP it does.

The negative natural log versions of these problems are
$$
\min_{\{S_1, S_2, \cdots, S_K\}} \; -\sum_{n=1}^{K} \ln\{ p(r_n / S_n) \}
\tag{16}
$$
and
$$
\min_{\{S_1, S_2, \cdots, S_K\}} \; -\sum_{n=1}^{K} \Big( \ln\{ p(r_n / S_n) \} + \ln\{ P(S_n / S_{n-1}) \} \Big) - \ln\{ P(S_0) \} ,
\tag{17}
$$
respectively. For the discussion below on employing the V...
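As a concrete illustration of the cost functions (16) and (17), here is a minimal Python sketch that evaluates the MLSE and MAP negative-log costs for a candidate state sequence and finds the MAP sequence by brute-force enumeration. Everything below is an illustrative assumption rather than part of the notes: the binary state alphabet, the Gaussian form of $p(r_n / S_n)$, the transition matrix A, the prior P0, and helper names such as neg_log_map are made up for the example, and the initial state $S_0$ is treated as given, so the $-\ln P(S_0)$ term is a constant in the minimization.

```python
import itertools
import math

import numpy as np

# --- Toy model parameters (illustrative; not from the notes) ---
STATES = [0, 1]                            # finite state alphabet, e.g. binary symbols
P0 = np.array([0.5, 0.5])                  # P(S_0): prior on the initial state
A = np.array([[0.9, 0.1],                  # A[i, j] = P(S_n = j / S_{n-1} = i)
              [0.1, 0.9]])
MEANS = np.array([-1.0, +1.0])             # observation mean for each state (assumed Gaussian model)
SIGMA = 0.7                                # observation noise standard deviation


def ln_p_obs(r_n, s_n):
    """ln p(r_n / S_n): Gaussian observation log-density (modeling assumption)."""
    return (-0.5 * ((r_n - MEANS[s_n]) / SIGMA) ** 2
            - math.log(SIGMA * math.sqrt(2.0 * math.pi)))


def neg_log_mlse(r, s):
    """Cost (16): -sum_n ln p(r_n / S_n); no Markov prior is used."""
    return -sum(ln_p_obs(r_n, s_n) for r_n, s_n in zip(r, s))


def neg_log_map(r, s, s0):
    """Cost (17): -sum_n [ln p(r_n/S_n) + ln P(S_n/S_{n-1})] - ln P(S_0)."""
    total = -math.log(P0[s0])              # constant with respect to S_1..S_K
    prev = s0
    for r_n, s_n in zip(r, s):
        total -= ln_p_obs(r_n, s_n) + math.log(A[prev, s_n])
        prev = s_n
    return total


def brute_force_map(r, s0):
    """Minimize (17) by enumerating all |STATES|^K candidate sequences."""
    return min(itertools.product(STATES, repeat=len(r)),
               key=lambda s: neg_log_map(r, s, s0))


if __name__ == "__main__":
    r = [0.8, 1.1, -0.2, -1.3]             # example received data r_1^K
    s_hat = brute_force_map(r, s0=0)
    print("MAP estimate:", s_hat, " cost:", neg_log_map(r, s_hat, s0=0))
```

The brute-force search costs $|S|^K$ evaluations and is included only to make (17) concrete; it is not the efficient way to solve the problem, which is the subject of the (truncated) discussion that follows in the notes.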