STAT 100B HW 5 Solution

Problem 1: Prove that least squares is a special case of maximum likelihood, if we assume that the observational errors follow independent normal distributions.

A: Assume that $Y_i \sim N(\alpha + \beta x_i, \sigma^2)$ independently. The likelihood is
$$
L(\alpha,\beta,\sigma^2) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left\{-\frac{1}{2\sigma^2}\left[Y_i - (\alpha+\beta x_i)\right]^2\right\}
= \frac{1}{(2\pi\sigma^2)^{n/2}} \exp\left\{-\frac{1}{2\sigma^2}\sum_{i=1}^{n}\left[Y_i - (\alpha+\beta x_i)\right]^2\right\}.
$$
For any fixed $\sigma^2$, maximizing $L(\alpha,\beta,\sigma^2)$ over $(\alpha,\beta)$ is equivalent to minimizing
$$
\mathrm{RSS}(\alpha,\beta) = \sum_{i=1}^{n}\left[Y_i - (\alpha+\beta x_i)\right]^2,
$$
which is exactly the least squares criterion. (A numerical check of this equivalence is sketched after Problem 2.)

Problem 2: Suppose $X_1,\ldots,X_n$ are independent observations from a mixture of normal distributions, $\lambda N(\mu_1,\sigma_1^2) + (1-\lambda)N(\mu_0,\sigma_0^2)$. Write down the likelihood function, and calculate the partial derivatives of the log-likelihood. Describe the EM algorithm for computing the parameter estimates.

A: Let $\theta = (\lambda,\mu_1,\sigma_1^2,\mu_0,\sigma_0^2)$. Let
$$
f_1(x;\theta) = \frac{1}{\sqrt{2\pi\sigma_1^2}} \exp\left\{-\frac{1}{2\sigma_1^2}(x-\mu_1)^2\right\}, \qquad
f_0(x;\theta) = \frac{1}{\sqrt{2\pi\sigma_0^2}} \exp\left\{-\frac{1}{2\sigma_0^2}(x-\mu_0)^2\right\},
$$
and $f(x;\theta) = \lambda f_1(x;\theta) + (1-\lambda) f_0(x;\theta)$. The likelihood is $L(\theta) = \prod_{i=1}^{n} f(X_i;\theta)$, and the log-likelihood is $l(\theta) = \sum_{i=1}^{n} \log f(X_i;\theta)$. ...
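As a supplementary numerical check of Problem 1 (not part of the original solution), the following sketch simulates data with independent normal errors and compares the maximum likelihood estimate of $(\alpha,\beta)$ with the least squares estimate. The simulated data, the random seed, and the use of numpy/scipy are illustrative assumptions.

```python
# Numerical check (illustrative): the normal-error MLE of (alpha, beta)
# coincides with the least squares estimate.  Simulated data; numpy/scipy
# are used only for convenience.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, size=n)
y = 2.0 + 0.5 * x + rng.normal(scale=1.5, size=n)   # true alpha = 2.0, beta = 0.5

def neg_log_likelihood(params):
    """Negative log-likelihood of Y_i ~ N(alpha + beta*x_i, sigma^2)."""
    alpha, beta, log_sigma = params
    sigma2 = np.exp(2.0 * log_sigma)                 # sigma parameterized on the log scale
    resid = y - (alpha + beta * x)
    return 0.5 * n * np.log(2.0 * np.pi * sigma2) + np.sum(resid ** 2) / (2.0 * sigma2)

mle_alpha, mle_beta = minimize(neg_log_likelihood, x0=np.zeros(3)).x[:2]

# Least squares: minimize RSS(alpha, beta) directly via the normal equations.
X = np.column_stack([np.ones(n), x])
ls_alpha, ls_beta = np.linalg.lstsq(X, y, rcond=None)[0]

print("MLE (alpha, beta):", mle_alpha, mle_beta)
print("LS  (alpha, beta):", ls_alpha, ls_beta)       # agree up to optimizer tolerance
```

Both pairs of estimates should agree to within the optimizer's tolerance, which is exactly the Problem 1 claim: with normal errors, maximizing the likelihood in $(\alpha,\beta)$ is the same as minimizing the residual sum of squares.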
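The preview ends before the likelihood derivatives and the EM description. As a supplement (not the author's original text), here is a minimal sketch of the standard EM iteration for the two-component mixture above, assuming the usual latent-indicator formulation: the E-step computes the responsibility $\gamma_i = \lambda f_1(X_i;\theta)/f(X_i;\theta)$ of the first component for each observation, and the M-step re-estimates $(\lambda,\mu_1,\sigma_1^2,\mu_0,\sigma_0^2)$ by responsibility-weighted averages.

```python
# Sketch of the standard EM algorithm for a two-component normal mixture
# lambda*N(mu1, s1^2) + (1 - lambda)*N(mu0, s0^2).  Illustrative only.
import numpy as np
from scipy.stats import norm

def em_two_normals(x, n_iter=100):
    x = np.asarray(x, dtype=float)
    # Crude initialization (an assumption; any reasonable starting point works).
    lam, mu1, mu0 = 0.5, x.max(), x.min()
    s1 = s0 = x.std()
    for _ in range(n_iter):
        # E-step: responsibility of component 1 for each observation.
        p1 = lam * norm.pdf(x, mu1, s1)
        p0 = (1.0 - lam) * norm.pdf(x, mu0, s0)
        gamma = p1 / (p1 + p0)
        # M-step: responsibility-weighted MLEs of the parameters.
        lam = gamma.mean()
        mu1 = np.sum(gamma * x) / np.sum(gamma)
        mu0 = np.sum((1.0 - gamma) * x) / np.sum(1.0 - gamma)
        s1 = np.sqrt(np.sum(gamma * (x - mu1) ** 2) / np.sum(gamma))
        s0 = np.sqrt(np.sum((1.0 - gamma) * (x - mu0) ** 2) / np.sum(1.0 - gamma))
    return lam, mu1, s1 ** 2, mu0, s0 ** 2

# Example usage on simulated data (hypothetical values):
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0, 1, 300), rng.normal(4, 1.5, 200)])
print(em_two_normals(data))   # estimates of (lambda, mu1, sigma1^2, mu0, sigma0^2)
```

Each iteration increases the log-likelihood $l(\theta)$, so in practice the loop can be stopped when $l(\theta)$ (or the parameter vector) changes by less than a small tolerance rather than after a fixed number of iterations.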