# 14. Maximum Likelihood Estimators - Regular Case (Jan 28, 31)


Statistics 3858: Maximum Likelihood Estimators - Regular Case, Large Sample Theory

## 1 Regular Case

The regular maximum likelihood case is essentially the case where calculus methods apply in order to calculate the MLE. These notes are written for the case of iid sampling, that is, for iid random variables. The statistical model has a finite dimensional parameter space $\Theta$, and each distribution in the model has pdf (or pmf) $f(\,\cdot\,;\theta)$. The function (with argument $x$)

$$\frac{\partial \log(f(x;\theta))}{\partial\theta}$$

is called the **score function**.

**Smoothness Condition: Regularity Condition 1.** Assume for any $\theta \in \Theta$ (except on the boundary) that $f$ satisfies, in the case of a pdf (in terms of integrals this is an interchange of integration and differentiation),

$$\int \frac{\partial f(x;\theta)}{\partial\theta}\, dx = \frac{\partial}{\partial\theta}\int f(x;\theta)\, dx ,$$

and in the pmf case, interpreting $P(X=x;\theta) = f(x;\theta)$,

$$\sum_x \frac{\partial f(x;\theta)}{\partial\theta} = \frac{\partial}{\partial\theta}\sum_x f(x;\theta).$$

*End of Assumption 1.*

This assumption is the smoothness condition alluded to in Theorem 8.5A in Rice. In both the continuous and discrete cases this assumption can be rewritten in terms of interchanging differentiation and integration, or in terms of interchanging differentiation and summation. This assumption tells us a very useful property of the score function, in particular its expectation at $\theta_0$, the true value of the parameter, that is

$$E_{\theta_0}\!\left(\frac{\partial \log(f(X;\theta))}{\partial\theta}\,\Big|_{\theta=\theta_0}\right) = E_{\theta_0}\!\left(\frac{\partial \log(f(X;\theta_0))}{\partial\theta}\right) = 0. \tag{1}$$

In the case of a pdf (in terms of integrals this is an interchange of integration and differentiation),

$$E_{\theta_0}\!\left(\frac{\partial \log(f(X;\theta))}{\partial\theta}\right) = \int \frac{\partial \log(f(x;\theta))}{\partial\theta}\, f(x;\theta_0)\, dx = \int \frac{\partial f(x;\theta)}{\partial\theta}\,\frac{1}{f(x;\theta)}\, f(x;\theta_0)\, dx .$$

Evaluate this derivative at $\theta = \theta_0$:

$$\int \frac{\partial f(x;\theta)}{\partial\theta}\,\frac{1}{f(x;\theta)}\, f(x;\theta_0)\, dx \,\Big|_{\theta=\theta_0} = \int \frac{\partial f(x;\theta_0)}{\partial\theta}\,\frac{1}{f(x;\theta_0)}\, f(x;\theta_0)\, dx = \int \frac{\partial f(x;\theta_0)}{\partial\theta}\, dx = \frac{\partial}{\partial\theta}\int f(x;\theta)\, dx \,\Big|_{\theta=\theta_0} = \frac{\partial\, 1}{\partial\theta} = 0 .$$

In the pmf case, interpreting $P(X=x;\theta) = f(x;\theta)$,

$$E_{\theta_0}\!\left(\frac{\partial \log(f(X;\theta))}{\partial\theta}\right) = \sum_x \frac{\partial \log(f(x;\theta))}{\partial\theta}\, f(x;\theta_0) = \sum_x \frac{\partial f(x;\theta)}{\partial\theta}\,\frac{1}{f(x;\theta)}\, f(x;\theta_0).$$

Evaluate this derivative at $\theta = \theta_0$:

$$\sum_x \frac{\partial f(x;\theta)}{\partial\theta}\,\frac{1}{f(x;\theta)}\, f(x;\theta_0) \,\Big|_{\theta=\theta_0} = \sum_x \frac{\partial f(x;\theta_0)}{\partial\theta}\,\frac{1}{f(x;\theta_0)}\, f(x;\theta_0) = \sum_x \frac{\partial f(x;\theta_0)}{\partial\theta} = \frac{\partial}{\partial\theta}\sum_x f(x;\theta) \,\Big|_{\theta=\theta_0} = \frac{\partial\, 1}{\partial\theta} = 0 .$$

**Smoothness Condition: Regularity Condition 2.** Assume for any $\theta \in \Theta$ (except on the boundary) that $f$ satisfies Assumption 1 and, in the case of a pdf (in terms of integrals this is an interchange of integration and differentiation),

$$\frac{\partial}{\partial\theta}\int \frac{\partial \log(f(x;\theta))}{\partial\theta}\, f(x;\theta)\, dx = \int \frac{\partial}{\partial\theta}\left\{\frac{\partial \log(f(x;\theta))}{\partial\theta}\, f(x;\theta)\right\} dx ,$$

and in the pmf case, interpreting $P(X=x;\theta) = f(x;\theta)$,

$$\frac{\partial}{\partial\theta}\sum_x \frac{\partial \log(f(x;\theta))}{\partial\theta}\, f(x;\theta) = \sum_x \frac{\partial}{\partial\theta}\left\{\frac{\partial \log(f(x;\theta))}{\partial\theta}\, f(x;\theta)\right\}.$$
