Slide 1: The EM Algorithm
Sin-Horng Chen, 2006/12/29

Slide 2: Incomplete data and ML estimation
X denotes the incomplete data and \lambda the model; we seek the ML (maximum likelihood) estimate of \lambda from the observed data sequence X = x_1 x_2 \cdots x_N, whose elements x_n are assumed independent. "Incomplete data" means the observed data plus unknown data: for example, the mixture component associated with each observation in a mixture model, or the state in an HMM. Since

    P(X | \lambda) = \prod_n P(x_n | \lambda),

the ML problem is

    \max_\lambda \log P(X | \lambda) = \max_\lambda \sum_n \log P(x_n | \lambda).

Slide 3: Why EM
If the data were complete, the likelihood L would be easy to optimize. With incomplete data, optimizing L directly (e.g., by the Newton-Raphson method) is difficult. EM instead uses a two-step iterative procedure; the steps are expectation and maximization.

Slide 4: Expectation step
Define an auxiliary function Q(\lambda, \lambda^{old}), where \lambda^{old} is the current model, \lambda is the new model, and k is the unknown data associated with x_n:

    Q(\lambda, \lambda^{old})
      = \sum_{n=1}^{N} \sum_{k} P(k | x_n, \lambda^{old}) \log P(k, x_n | \lambda)
      = \sum_{n=1}^{N} \sum_{k} P(k | x_n, \lambda^{old}) \log [ P(k | x_n, \lambda) \, P(x_n | \lambda) ]
      = \sum_{n=1}^{N} \sum_{k} P(k | x_n, \lambda^{old}) \log P(k | x_n, \lambda)
        + \sum_{n=1}^{N} \sum_{k} P(k | x_n, \lambda^{old}) \log P(x_n | \lambda)
      = \sum_{n=1}^{N} \sum_{k} P(k | x_n, \lambda^{old}) \log P(k | x_n, \lambda)
        + \sum_{n=1}^{N} \log P(x_n | \lambda),

where the last step uses \sum_k P(k | x_n, \lambda^{old}) = 1.

Slide 5: The derivation of Q(\lambda, \lambda^{old}) continues here, but the remainder is cut off in this preview ( ,...
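The two-step procedure above has closed forms for a Gaussian mixture: the E-step computes the posteriors P(k | x_n, \lambda^{old}), and the M-step re-estimates the parameters that maximize Q. The sketch below is an illustrative 1-D, two-component example that is not from the slides; `em_gmm`, `normal_pdf`, and the toy data are assumed names and values.

```python
import math

def normal_pdf(x, mu, var):
    """Gaussian density N(x; mu, var)."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm(data, weights, means, vars_, n_iter=50):
    """EM for a 1-D Gaussian mixture (illustrative sketch).

    E-step: gamma[n][k] = P(k | x_n, lambda_old), the posterior of
    component k given observation x_n under the current model.
    M-step: closed-form re-estimates that maximize Q(lambda, lambda_old).
    """
    K = len(weights)
    for _ in range(n_iter):
        # E-step: responsibilities P(k | x_n, lambda_old)
        gamma = []
        for x in data:
            joint = [weights[k] * normal_pdf(x, means[k], vars_[k]) for k in range(K)]
            total = sum(joint)
            gamma.append([j / total for j in joint])
        # M-step: weighted counts, means, and variances
        Nk = [sum(g[k] for g in gamma) for k in range(K)]
        weights = [Nk[k] / len(data) for k in range(K)]
        means = [sum(g[k] * x for g, x in zip(gamma, data)) / Nk[k] for k in range(K)]
        vars_ = [max(sum(g[k] * (x - means[k]) ** 2 for g, x in zip(gamma, data)) / Nk[k], 1e-6)
                 for k in range(K)]  # floor the variance to avoid degenerate components
    return weights, means, vars_

# Toy data: two well-separated clusters around 1.0 and 5.0
data = [0.9, 1.0, 1.1, 4.9, 5.0, 5.1]
w, m, v = em_gmm(data, [0.5, 0.5], [0.0, 6.0], [1.0, 1.0])
```

Each M-step maximizes Q with \lambda^{old} fixed, which (as the EM argument guarantees) never decreases the incomplete-data log-likelihood \sum_n \log P(x_n | \lambda).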
Fall '08
Sin-Horng Chen
