# Probability and Statistics for Engineering and the Sciences (with CD-ROM and InfoTrac)

## Stat 312, Lecture 05: Maximum Likelihood Estimation

Moo K. Chung ([email protected])
September 14, 2004

1. **Invariance principle.** If $\hat\theta$ is the MLE of the parameter $\theta$, then the MLE of $h(\theta)$ is $h(\hat\theta)$ for a (one-to-one) function $h$.

   *Proof (partial).* Consider the likelihood function $L(\theta)$; $\hat\theta$ satisfies $\frac{dL(\theta)}{d\theta} = 0$. Let $\varphi = h(\theta)$. Then the likelihood function for $\varphi = h(\theta)$ is given by $L(h^{-1}(\varphi))$. Differentiating the likelihood with respect to $\varphi$, we have
   $$\frac{dL(h^{-1}(\varphi))}{d\varphi} = \frac{dL(\theta)}{d\varphi} = \frac{dL(\theta)}{d\theta}\cdot\frac{1}{h'(\theta)} = 0.$$

2. **Log-likelihood.** Maximizing $L(\theta)$ is equivalent to maximizing $\ln L(\theta)$, since $\ln$ is an increasing function.

   *Example.* This technique is best illustrated by finding the MLEs of the parameters of $N(\mu, \sigma^2)$:
   $$\hat\mu = \bar X, \qquad \hat\sigma^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar X)^2$$
   are the MLEs of $\mu$ and $\sigma^2$ respectively. Note that $\hat\sigma^2$ is not an unbiased estimator of $\sigma^2$.

3. **Asymptotic unbiasedness.** When the sample size is large, the maximum likelihood estimator of $\theta$ is approximately unbiased. The MLE of $\theta$ …
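The closed-form MLEs and the invariance principle can be checked numerically. The following sketch (not part of the lecture; the true parameter values and sample size are illustrative assumptions) draws a synthetic normal sample, computes $\hat\mu$ and $\hat\sigma^2$ from the formulas above, obtains $\hat\sigma$ via invariance, and verifies that the closed-form estimates do at least as well as nearby parameter values under the log-likelihood:

```python
import math
import random

random.seed(0)

# Illustrative (assumed) true parameters: N(mu = 2, sigma^2 = 9).
mu_true, sigma_true = 2.0, 3.0
n = 10_000
x = [random.gauss(mu_true, sigma_true) for _ in range(n)]

# Closed-form MLEs from the lecture: mu_hat = sample mean,
# sigma2_hat = (1/n) * sum (x_i - x_bar)^2  (divisor n, not n - 1).
mu_hat = sum(x) / n
sigma2_hat = sum((xi - mu_hat) ** 2 for xi in x) / n

# Invariance principle: the MLE of sigma = h(sigma^2) with h = sqrt
# is h applied to the MLE of sigma^2.
sigma_hat = math.sqrt(sigma2_hat)

def loglik(mu, sigma2):
    """Gaussian log-likelihood ln L(mu, sigma^2), dropping the additive constant."""
    return -n / 2 * math.log(sigma2) - sum((xi - mu) ** 2 for xi in x) / (2 * sigma2)

# The closed-form MLEs maximize the log-likelihood, so perturbing either
# coordinate can only lower it.
assert loglik(mu_hat, sigma2_hat) >= loglik(mu_hat + 0.05, sigma2_hat)
assert loglik(mu_hat, sigma2_hat) >= loglik(mu_hat, sigma2_hat * 1.05)

print(mu_hat, sigma2_hat, sigma_hat)
```

Working with `loglik` rather than the likelihood itself is exactly the point of item 2: the product of $n$ densities underflows for large $n$, while the log-likelihood stays numerically stable.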