EE596A/Winter 2013/DGMs – Lecture 4 - Jan 23rd, 2013, Prof. Jeff Bilmes (pages 4-73 to 4-76 of 239).

State Duration Modeling

If we have multiple parallel states in series, all of which share the same observation distribution, we can construct much more interesting (multimodal) duration distributions. For example, the left chain in the figure below has the duration distribution shown on the right: a mixture of negative binomial distributions.

[Figure: left, a left-to-right chain of tied states with transition probabilities such as 0.6/0.4, 0.8/0.2, and 0.99/0.01; right, the resulting duration distribution p(d) for d roughly 10 to 60, with probabilities up to about 0.025, which is multimodal.]

Other examples: very long chains, ladders, fixed-length distributions (histograms), and so on. HMMs can thus have flexible duration distributions, at the cost of extra states. (A numerical sketch of such a duration distribution appears at the end of this preview.)

What HMMs can do - summary so far

Observations are not i.i.d., but conditioned on the state variables they are independent. Observations are also not "Viterbi i.i.d." An HMM is a stationary process p(x_{1:n}) whenever the underlying hidden Markov chain is a stationary process. For a single-Gaussian-per-state HMM, the covariance decays as
$$\operatorname{cov}(X_t, X_{t+h}) \;=\; \sum_{ij} \mu_i \mu_j (A^h)_{ij}\,\pi_i \;-\; \Big(\sum_i \mu_i \pi_i\Big)\Big(\sum_i \mu_i \pi_i\Big) \;\xrightarrow{\;h\to\infty\;}\; \sum_{ij} \mu_i \mu_j \pi_j \pi_i \;-\; \Big(\sum_i \mu_i \pi_i\Big)\Big(\sum_i \mu_i \pi_i\Big) \;=\; 0,$$
but the mutual information (in practice) can apparently extend reasonably far in time (though it also decays). Parameter sharing means enormous flexibility in state duration models (e.g., negative binomial, mixtures thereof, fixed histograms).

HMMs Generative Accuracy

We can view an HMM as an approximate generative distribution over the observation variables, as in
$$p_h(x_{1:T}) \approx p(x_{1:T}).$$
Given that $p_h$ is an approximation, one that is a mixture
$$p_h(x_{1:T}) \;=\; \sum_{q_{1:T}} p_h(x_{1:T}, q_{1:T}), \tag{4.76}$$
what can we say about $p_h$ and its accuracy?
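The duration distributions described on the State Duration Modeling slide can be reproduced numerically. The sketch below is a minimal illustration, not the lecture's exact figure: the chain lengths, self-loop probabilities, and mixture weights are made-up values. It convolves the geometric durations of a series chain of tied states, then mixes two such chains to obtain a multimodal mixture of negative binomials.

```python
# Sketch (assumptions): duration distribution induced by a left-to-right chain of
# states that all share one observation distribution.  Each state has a self-loop
# probability; leaving the last state ends the "macro state".  The total duration
# is a sum of independent geometrics, i.e. a negative binomial.  All numerical
# values below are illustrative, not taken from the lecture figure.

import numpy as np

def chain_duration_pmf(self_loop_probs, max_d=80):
    """pmf of the total duration of a series chain with the given self-loop probs."""
    pmf = np.zeros(max_d + 1)
    pmf[0] = 1.0                                   # zero time spent before entering
    for p in self_loop_probs:
        # convolve with Geometric(1 - p) supported on {1, 2, ...}
        geom = np.array([0.0] + [(1 - p) * p ** (d - 1) for d in range(1, max_d + 1)])
        pmf = np.convolve(pmf, geom)[: max_d + 1]
    return pmf

# Two parallel chains entered with equal probability -> mixture of negative
# binomials, which can be multimodal.
pmf_short = chain_duration_pmf([0.6, 0.6, 0.6])        # short-duration mode
pmf_long  = chain_duration_pmf([0.9, 0.9, 0.9, 0.9])   # long-duration mode
mixture = 0.5 * pmf_short + 0.5 * pmf_long

for d in (5, 10, 20, 30, 40):
    print(f"P(duration = {d:2d}) = {mixture[d]:.4f}")
```

Printing the full pmf (or plotting it) shows two separated modes, one per chain, which is the qualitative behavior the slide's figure illustrates.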
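The covariance-decay formula in the summary slide can also be checked numerically. The sketch below uses an arbitrary 3-state transition matrix and per-state Gaussian means (illustration values only, not from the lecture), computes the stationary distribution, and evaluates cov(X_t, X_{t+h}) for increasing lag h; the values shrink toward zero as A^h approaches the rank-one matrix with rows equal to pi.

```python
# Sketch (assumptions): numerical check of the covariance-decay formula for a
# single-Gaussian-per-state HMM.  A and mu are arbitrary illustration values;
# pi is the stationary distribution of A (pi = pi A).

import numpy as np

A = np.array([[0.90, 0.08, 0.02],
              [0.05, 0.90, 0.05],
              [0.02, 0.08, 0.90]])
mu = np.array([-1.0, 0.0, 2.0])                  # per-state Gaussian means

# stationary distribution: left eigenvector of A for eigenvalue 1, normalized
evals, evecs = np.linalg.eig(A.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

mean = mu @ pi                                   # E[X_t] = sum_i mu_i pi_i

def cov(h):
    Ah = np.linalg.matrix_power(A, h)
    # sum_{ij} mu_i mu_j (A^h)_{ij} pi_i  -  (sum_i mu_i pi_i)^2
    return (pi * mu) @ Ah @ mu - mean ** 2

for h in (1, 2, 5, 10, 20, 50, 100):
    print(f"h = {h:3d}   cov(X_t, X_t+h) = {cov(h): .6f}")
```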
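Equation (4.76) sums over exponentially many state sequences q_{1:T}, but in practice the observation marginal p_h(x_{1:T}) is computed with the forward recursion in O(T N^2) time. The sketch below assumes a single-Gaussian-per-state HMM; the function name log_marginal and the specific parameter values are illustrative choices, not from the lecture.

```python
# Sketch (assumptions): computing log p_h(x_{1:T}) = log sum_{q_{1:T}} p_h(x_{1:T}, q_{1:T})
# with a log-space forward pass, for a single-Gaussian-per-state HMM.

import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

def log_marginal(x, log_pi0, log_A, means, stds):
    """log p_h(x_{1:T}) via the forward recursion."""
    T, N = len(x), len(means)
    # log b_j(x_t): per-state Gaussian log-likelihoods, shape (T, N)
    log_b = np.stack([norm.logpdf(x, means[j], stds[j]) for j in range(N)], axis=1)
    log_alpha = log_pi0 + log_b[0]                       # alpha_1(j) = pi0_j * b_j(x_1)
    for t in range(1, T):
        # alpha_t(j) = b_j(x_t) * sum_i alpha_{t-1}(i) * A_{ij}
        log_alpha = log_b[t] + logsumexp(log_alpha[:, None] + log_A, axis=0)
    return logsumexp(log_alpha)                          # log sum_j alpha_T(j)

# Illustrative 2-state model and a short observation sequence.
log_A   = np.log(np.array([[0.9, 0.1], [0.2, 0.8]]))
log_pi0 = np.log(np.array([0.5, 0.5]))
x = np.array([0.1, -0.3, 2.1, 1.8, 0.2])
print("log p_h(x_1:T) =", log_marginal(x, log_pi0, log_A, means=[0.0, 2.0], stds=[1.0, 1.0]))
```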