Problem Set 5
MAS 622J/1.126J: Pattern Recognition and Analysis
Due Monday, 6 November 2006

[Note: All instructions to plot data or write a program should be carried out using either Python accompanied by the matplotlib package or Matlab. Feel free to use either or both, but in order to maintain a reasonable level of consistency and simplicity we ask that you do not use other software tools.]

Problem 1: Parameter Learning by Estimation and Maximization

Consider data D = {(1, 2)^T, (3, 3)^T, (2, ∗)^T}, sampled from a two-dimensional (separable) distribution p(x1, x2) = p(x1) p(x2), with

    p(x1) = (1/θ1) e^(-x1/θ1)  if x1 ≥ 0          p(x2) = 1/θ2  if 0 ≤ x2 ≤ θ2
          = 0                  otherwise                 = 0    otherwise

and a missing feature value, ∗.

a. Start with an initial estimate, θ⁰ = (7, 5)^T, and analytically calculate the estimate Q(θ, θ⁰), the estimation (E) step of the EM algorithm.

b. Find the θ that maximizes your Q(θ, θ⁰), the maximization (M) step of the EM algorithm.

Problem 2: Baum-Welch algorithm and discrete HMMs

Download the datasets from the course webpage. The datasets consist of training and testing sequences belonging to two classes. Implement
the Baum-Welch algorithm for training a discrete HMM.

a. Train two fully connected HMMs, each with one∗ hidden node (one HMM for each class of data) and transition probabilities.

   i.   Implement the Viterbi algorithm to decode each test sequence using both HMMs. Show the log probability of each test sequence using each HMM.
   ii.  Compute the recognition accuracy on the entire test set.
   iii. List the output probabilities and state transition probabilities of each HMM.
   iv.  State the threshold you are using and the maximum number of iterations.
   v.   Include a complete listing of your source code.

b. Repeat part (a), replacing one∗ with three.

c. Repeat part (a), replacing one∗ with five.
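For Problem 1, a numerical cross-check can help validate the analytic E step. The sketch below is a minimal Python illustration, not the required derivation: the function names log_lik and Q are our own, and the initial estimate θ⁰ = (7, 5)^T is an assumption reconstructed from the problem statement. It approximates Q(θ, θ⁰) by averaging the complete-data log-likelihood over a fine grid of values for the missing feature, which is uniform on [0, θ⁰₂] under the old parameters.

```python
import numpy as np

# Data from Problem 1; the second feature of the third point is missing.
# ASSUMPTION: theta0 = (7, 5), as reconstructed from the problem statement.
X1 = np.array([1.0, 3.0, 2.0])
X2_OBSERVED = np.array([2.0, 3.0])

def log_lik(x1, x2, theta):
    """Complete-data log-likelihood of the separable model
    p(x1) = (1/t1) exp(-x1/t1) on x1 >= 0, p(x2) = 1/t2 on [0, t2]."""
    t1, t2 = theta
    if np.any(x2 < 0) or np.any(x2 > t2):
        return -np.inf  # some x2 falls outside the uniform's support
    return np.sum(-np.log(t1) - x1 / t1) - len(x2) * np.log(t2)

def Q(theta, theta0, n_grid=2001):
    """Numerical approximation of Q(theta, theta0): average the
    complete-data log-likelihood over a grid for the missing x2,
    which is uniform on [0, theta0[1]] under the old parameters."""
    grid = np.linspace(0.0, theta0[1], n_grid)
    return np.mean([log_lik(X1, np.append(X2_OBSERVED, m), theta)
                    for m in grid])
```

Because the missing coordinate enters the uniform term only through its support check, this approximation is -inf whenever θ₂ < θ⁰₂; under the assumed θ⁰, the M step therefore drives θ₂ to 5 and θ₁ to the sample mean of x1.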
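For the Baum-Welch trainer in Problem 2, a common structure is a scaled forward-backward pass feeding one re-estimation update per EM iteration. The sketch below is a minimal NumPy illustration under that structure; the function names forward_backward and baum_welch_step are our own, and it omits the convergence threshold and iteration cap that the problem asks you to report.

```python
import numpy as np

# Minimal Baum-Welch sketch for a discrete HMM (our own naming):
# pi[i]: initial state probs, A[i, j]: transition probs,
# B[i, k]: probability of emitting symbol k from state i.

def forward_backward(obs, pi, A, B):
    """Scaled forward-backward pass; returns alpha, beta, log P(obs)."""
    T, N = len(obs), len(pi)
    alpha, c = np.zeros((T, N)), np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]
    beta = np.zeros((T, N))
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
    return alpha, beta, np.log(c).sum()

def baum_welch_step(sequences, pi, A, B):
    """One EM iteration over a list of observation sequences."""
    N, M = B.shape
    pi_new = np.zeros(N)
    A_num, A_den = np.zeros((N, N)), np.zeros(N)
    B_num, B_den = np.zeros((N, M)), np.zeros(N)
    for obs in sequences:
        alpha, beta, _ = forward_backward(obs, pi, A, B)
        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)  # state posteriors
        pi_new += gamma[0]
        for t in range(len(obs) - 1):
            # posterior of transitioning i -> j at time t
            xi = alpha[t][:, None] * A * (B[:, obs[t + 1]] * beta[t + 1])[None, :]
            A_num += xi / xi.sum()
        A_den += gamma[:-1].sum(axis=0)
        for t, o in enumerate(obs):
            B_num[:, o] += gamma[t]
        B_den += gamma.sum(axis=0)
    return pi_new / len(sequences), A_num / A_den[:, None], B_num / B_den[:, None]
```

On the course datasets, you would iterate baum_welch_step until the total log likelihood improves by less than your threshold or the iteration cap is reached, train one HMM per class, and classify each test sequence by whichever HMM assigns it the higher log likelihood.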
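The Viterbi decoder in part (a)(i) is typically written in log space to avoid underflow on long sequences. A minimal sketch, again with our own naming (pi, A, B for initial, transition, and output probabilities):

```python
import numpy as np

# Log-space Viterbi decoder for a discrete HMM (our own naming).

def viterbi(obs, pi, A, B):
    """Return the most likely state path and its log probability."""
    T, N = len(obs), len(pi)
    log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)
    delta = np.zeros((T, N))           # best log prob ending in each state
    psi = np.zeros((T, N), dtype=int)  # backpointers
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A  # scores[i, j]: i -> j
        psi[t] = np.argmax(scores, axis=0)
        delta[t] = np.max(scores, axis=0) + log_B[:, obs[t]]
    path = np.zeros(T, dtype=int)
    path[-1] = np.argmax(delta[-1])
    for t in range(T - 2, -1, -1):     # backtrack
        path[t] = psi[t + 1, path[t + 1]]
    return path, delta[-1].max()
```

Note that this returns the log probability of the single best state path; the problem's "log probability of each test sequence" may instead intend the full forward probability, so check which quantity the course expects before reporting.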
This note was uploaded on 12/04/2011 for the course ESD 1.124 taught by Professor Kevinamaratunga during the Fall '00 term at MIT.
