EE596A/Winter 2013/DGMs – Lecture 4, Jan 23rd, 2013. Prof. Jeff Bilmes

HMM stationarity depends on the Markov chain

Therefore, the HMM's stationarity condition is entirely determined by the stationarity condition of the underlying hidden Markov chain.

Consider the ways in which HMMs are often used:
- Long chains
- Chains with cyclic transition matrices
- Chains with upper-triangular transition matrices
- Chains with strictly left-to-right transitions (e.g., speech recognition)

Hence, only in rare cases are HMMs, as used in practice, stationary stochastic processes.
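The point above can be illustrated numerically: the chain (and hence the HMM) is stationary only if the initial distribution pi satisfies pi A = pi. The following is a minimal sketch (names and example matrices are illustrative, not from the lecture), contrasting a cyclic chain, where a stationary initial distribution exists and is natural, with a strictly left-to-right chain, where the usual "start in state 1" initialization is not stationary.

```python
import numpy as np

def is_stationary(pi, A, tol=1e-10):
    """Check whether pi is a stationary distribution of transition matrix A,
    i.e. whether pi @ A == pi (row-stochastic convention)."""
    return np.allclose(pi @ A, pi, atol=tol)

# Cyclic (permutation) transition matrix: the uniform distribution is stationary.
A_cyclic = np.array([[0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0],
                     [1.0, 0.0, 0.0]])
pi_uniform = np.full(3, 1 / 3)

# Strictly left-to-right chain (upper-triangular A), as in speech recognition:
# probability mass drifts toward the final absorbing state, so the typical
# initial distribution (1, 0, 0) is not stationary.
A_l2r = np.array([[0.5, 0.5, 0.0],
                  [0.0, 0.5, 0.5],
                  [0.0, 0.0, 1.0]])
pi_start = np.array([1.0, 0.0, 0.0])

print(is_stationary(pi_uniform, A_cyclic))  # stationary
print(is_stationary(pi_start, A_l2r))       # not stationary
```

This makes concrete why the left-to-right HMMs common in practice are non-stationary processes: the state distribution at time t changes with t.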
Gaussian Mixture HMM

One of the most widely used HMMs in practice is one where the observation distributions are Gaussian mixtures, where

    p(x \mid q) = \sum_c p(x \mid q, c)\, p(c \mid q)                      (4.55)
                = \sum_m N(x \mid \mu_{qm}, \Sigma_{qm})\, c_{mq}          (4.56)

with mixture weights c_{mq} = p(c = m \mid q), and where

    N(x \mid \mu, \Sigma) = \frac{1}{(2\pi)^{d/2} |\Sigma|^{1/2}}
                            \exp\!\left( -\frac{1}{2} (x - \mu)^\top \Sigma^{-1} (x - \mu) \right)   (4.57)
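Equations (4.55)-(4.57) can be sketched directly in code. This is a minimal illustration (function and variable names are mine, not from the lecture): the per-state observation density is a weighted sum of multivariate Gaussian densities.

```python
import numpy as np

def gaussian_pdf(x, mu, Sigma):
    """Multivariate normal density N(x | mu, Sigma), as in eq. (4.57)."""
    d = len(mu)
    diff = x - mu
    norm = (2 * np.pi) ** (d / 2) * np.sqrt(np.linalg.det(Sigma))
    # Use a linear solve instead of an explicit matrix inverse.
    return np.exp(-0.5 * diff @ np.linalg.solve(Sigma, diff)) / norm

def gmm_likelihood(x, c_q, mu_q, Sigma_q):
    """Observation likelihood for state q, eqs. (4.55)-(4.56):
    p(x | q) = sum_m c_{qm} N(x | mu_{qm}, Sigma_{qm})."""
    return sum(c * gaussian_pdf(x, mu, S)
               for c, mu, S in zip(c_q, mu_q, Sigma_q))

# Example: a two-component mixture in 2-D for a single state q.
c_q = np.array([0.4, 0.6])                  # mixture weights, sum to 1
mu_q = [np.zeros(2), np.array([3.0, 3.0])]  # component means mu_{qm}
Sigma_q = [np.eye(2), 2 * np.eye(2)]        # component covariances Sigma_{qm}

p = gmm_likelihood(np.array([0.0, 0.0]), c_q, mu_q, Sigma_q)
```

In a full Gaussian-mixture HMM, `gmm_likelihood` would supply the observation terms p(x_t | q_t) used by the forward-backward and Viterbi recursions.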