14. Maximum Likelihood Estimators - Regular Case (Jan 28, 31)

Statistics 3858: Maximum Likelihood Estimators - Regular Case, Large Sample Theory

1 Regular Case

The regular maximum likelihood case is essentially the case in which calculus methods apply to calculate the MLE. These notes are written for iid sampling, that is, for iid random variables. The statistical model has a finite dimensional parameter space Θ, and each distribution in the model has pdf (or pmf) f(·; θ).

The function (with argument x)

\[ \frac{\partial \log f(x;\theta)}{\partial \theta} \]

is called the score function.

Smoothness Condition: Regularity Condition 1. Assume for any θ ∈ Θ (except on the boundary) that f satisfies the following interchange of differentiation and integration (or summation). In the case of a pdf,

\[ \int \frac{\partial f(x;\theta)}{\partial \theta}\,dx = \frac{\partial}{\partial \theta} \int f(x;\theta)\,dx . \]

In the pmf case, interpreting P(X = x; θ) = f(x; θ),

\[ \sum_x \frac{\partial f(x;\theta)}{\partial \theta} = \frac{\partial}{\partial \theta} \sum_x f(x;\theta) . \]

End of Assumption 1.

This assumption is the smoothness condition alluded to in Theorem 8.5A in Rice. In both the continuous and discrete cases it amounts to interchanging differentiation with integration, or differentiation with summation. The assumption yields a very useful property of the score function: its expectation at θ₀, the true value of the parameter, is zero, that is

\[ E_{\theta_0}\!\left( \left. \frac{\partial \log f(X;\theta)}{\partial \theta} \right|_{\theta=\theta_0} \right) = E_{\theta_0}\!\left( \frac{\partial \log f(X;\theta_0)}{\partial \theta} \right) = 0 . \tag{1} \]
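Equation (1) can be checked numerically for a concrete model. The sketch below is my own example, not part of the notes: it takes the Poisson pmf with an assumed true parameter λ₀ = 2.5, whose score function is x/λ − 1, and computes the expected score by summing score × pmf over x. The truncation point 100 is an arbitrary cutoff beyond which the remaining Poisson mass is negligible.

```python
import math

# Numerical sketch of equation (1): the expected score at the true
# parameter is zero. Model and parameter value are assumptions for
# illustration, not taken from the notes.
lam0 = 2.5  # assumed true parameter (theta_0)

def poisson_pmf(x, lam):
    # f(x; lam) = exp(-lam) lam^x / x!
    return math.exp(-lam) * lam**x / math.factorial(x)

def score(x, lam):
    # d/d lam log f(x; lam) = x/lam - 1 for the Poisson pmf
    return x / lam - 1.0

# E_{lam0}[score(X, lam0)], with the sum truncated at x = 100 where the
# remaining probability mass is negligible.
expected_score = sum(score(x, lam0) * poisson_pmf(x, lam0) for x in range(101))
print(expected_score)  # approximately 0
```

Analytically this is just E[X]/λ₀ − 1 = λ₀/λ₀ − 1 = 0, so the numerical sum differs from zero only by truncation and rounding error.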
To verify (1) in the case of a pdf, in terms of integrals this is again an interchange of integration and differentiation:

\[ E_{\theta_0}\!\left( \frac{\partial \log f(X;\theta)}{\partial \theta} \right) = \int \frac{\partial \log f(x;\theta)}{\partial \theta}\, f(x;\theta_0)\,dx = \int \frac{\partial f(x;\theta)}{\partial \theta}\,\frac{1}{f(x;\theta)}\, f(x;\theta_0)\,dx . \]

Evaluating this at θ = θ₀,

\[ \left. \int \frac{\partial f(x;\theta)}{\partial \theta}\,\frac{1}{f(x;\theta)}\, f(x;\theta_0)\,dx \right|_{\theta=\theta_0} = \int \frac{\partial f(x;\theta_0)}{\partial \theta}\,\frac{1}{f(x;\theta_0)}\, f(x;\theta_0)\,dx = \int \frac{\partial f(x;\theta_0)}{\partial \theta}\,dx = \left. \frac{\partial}{\partial \theta} \int f(x;\theta)\,dx \right|_{\theta=\theta_0} = \frac{\partial\, 1}{\partial \theta} = 0 . \]

In the pmf case, interpreting P(X = x; θ) = f(x; θ),

\[ E_{\theta_0}\!\left( \frac{\partial \log f(X;\theta)}{\partial \theta} \right) = \sum_x \frac{\partial \log f(x;\theta)}{\partial \theta}\, f(x;\theta_0) = \sum_x \frac{\partial f(x;\theta)}{\partial \theta}\,\frac{1}{f(x;\theta)}\, f(x;\theta_0) . \]

Evaluating this at θ = θ₀,

\[ \left. \sum_x \frac{\partial f(x;\theta)}{\partial \theta}\,\frac{1}{f(x;\theta)}\, f(x;\theta_0) \right|_{\theta=\theta_0} = \sum_x \frac{\partial f(x;\theta_0)}{\partial \theta}\,\frac{1}{f(x;\theta_0)}\, f(x;\theta_0) = \sum_x \frac{\partial f(x;\theta_0)}{\partial \theta} = \left. \frac{\partial}{\partial \theta} \sum_x f(x;\theta) \right|_{\theta=\theta_0} = \frac{\partial\, 1}{\partial \theta} = 0 . \]

Smoothness Condition: Regularity Condition 2. Assume for any θ ∈ Θ (except on the boundary) that f satisfies Assumption 1 and, in addition, the following interchange. In the case of a pdf, in terms of integrals this is an interchange of integration and differentiation:

\[ \frac{\partial}{\partial \theta} \int \frac{\partial \log f(x;\theta)}{\partial \theta}\, f(x;\theta)\,dx = \int \frac{\partial}{\partial \theta} \left\{ \frac{\partial \log f(x;\theta)}{\partial \theta}\, f(x;\theta) \right\} dx . \]

In the pmf case, interpreting P(X = x; θ) = f(x; θ),

\[ \frac{\partial}{\partial \theta} \sum_x \frac{\partial \log f(x;\theta)}{\partial \theta}\, f(x;\theta) = \sum_x \frac{\partial}{\partial \theta} \left\{ \frac{\partial \log f(x;\theta)}{\partial \theta}\, f(x;\theta) \right\} . \]
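Regularity Condition 2 permits differentiating the identity Σₓ (∂ log f(x;θ)/∂θ) f(x;θ) = 0 with respect to θ under the summation sign. The sketch below is my own numerical illustration, not from the notes: for the Poisson pmf (an assumed example), a central finite difference of the expected score in θ should come out approximately zero, consistent with differentiating a quantity that is identically zero.

```python
import math

# Numerical sketch for Regularity Condition 2: the expected score
#   g(lam) = sum_x (d log f(x;lam)/d lam) f(x; lam)
# is identically 0, so its derivative in lam is 0 as well. The Poisson
# model, step size h, and truncation point are assumptions for
# illustration only.

def poisson_pmf(x, lam):
    # f(x; lam) = exp(-lam) lam^x / x!
    return math.exp(-lam) * lam**x / math.factorial(x)

def expected_score(lam):
    # sum_x (x/lam - 1) f(x; lam), truncated where the mass is negligible
    return sum((x / lam - 1.0) * poisson_pmf(x, lam) for x in range(121))

lam0, h = 3.0, 1e-5
derivative = (expected_score(lam0 + h) - expected_score(lam0 - h)) / (2 * h)
print(derivative)  # approximately 0
```

Expanding the product rule inside the sum gives the usual information identity, E[∂² log f/∂θ²] + E[(score)²] = 0, which is where Condition 2 is typically put to use.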
