Stat 411 – Lecture Notes Supplement
Computation of maximum likelihood estimators*†‡
Ryan Martin
Spring 2012

1 Introduction

Suppose that sample data X_1, ..., X_n are iid with common distribution having PMF/PDF f_θ(x). The goal is to estimate the unknown parameter θ. In simple examples (including those presented in class), there is a closed-form expression for θ̂, the maximum likelihood estimator (MLE), i.e., the maximizer of the likelihood function L(θ) = ∏_{i=1}^n f_θ(X_i). But in many practical problems, there may be more than one solution to the likelihood equation (∂/∂θ) log L(θ) = 0, and there may be no nice formula for those solutions. In such cases, one will need to use numerical methods to compute the MLE. The purpose of this document is to elaborate on this computational aspect. These concepts are not central to our goal of developing the basics of statistical theory and, as such, will not be tested. However, they are important throughout all of applied and theoretical statistics. In this note I shall focus on just one optimization strategy, namely Newton's method. A course on computational statistics would delve deeper into other methods, such as the EM algorithm, simulated annealing, and iteratively reweighted least squares.

2 Newton's method

Suppose the goal is to solve g(x) = 0 for some function g. Newton's method is one useful technique, and the basics are presented in early calculus courses. The idea is based on the fact that, locally, any differentiable function can be suitably approximated by a linear function. This linear function is then used to define a recursive procedure that will, under suitable conditions, eventually find the desired solution.

2.1 Single parameter problems

Consider a problem where the unknown parameter θ is a number. An example is where the underlying distribution is Pois(θ).
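The linear-approximation idea leads to the recursion x_{k+1} = x_k − g(x_k)/g′(x_k): replace g by its tangent line at the current iterate and jump to that line's root. As a minimal sketch (not part of the original notes, and with an illustrative equation chosen here), a generic Newton solver in Python might look like:

```python
def newton(g, g_prime, x0, tol=1e-10, max_iter=100):
    """Solve g(x) = 0 by Newton's method starting from x0.

    Each step replaces g by its tangent line at the current
    iterate x and moves to the root of that line:
        x <- x - g(x) / g'(x)
    """
    x = x0
    for _ in range(max_iter):
        step = g(x) / g_prime(x)
        x = x - step
        if abs(step) < tol:  # successive iterates have stabilized
            return x
    raise RuntimeError("Newton's method did not converge")

# Illustration: solve x^2 - 2 = 0, whose positive root is sqrt(2)
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```

As the text notes, convergence holds only "under suitable conditions": the method can diverge or cycle if the starting point x0 is poor or if g′ vanishes near an iterate.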
The next subsection deals with the case of a vector of parameters. Write ℓ(θ) for the log-likelihood log L(θ). Assume that ℓ(θ) is twice differentiable with respect to θ; this is not really a practical restriction, since the good theoretical properties of MLEs assume this and more.

* Version: February 15, 2012
† Please do not distribute these notes without the author's consent ([email protected]).
‡ These notes are meant solely to supplement in-class lectures. The author makes no guarantees that these notes are free of typos or other, more serious errors.
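To see the recursion applied to a likelihood equation with no closed-form solution, here is a hypothetical sketch (the Cauchy location model and the data are my own illustrative choices, not an example from the notes). With f_θ(x) = 1/(π(1 + (x−θ)²)), the score ℓ′(θ) = Σ 2(X_i−θ)/(1 + (X_i−θ)²) has no closed-form root, so Newton's update θ_{k+1} = θ_k − ℓ′(θ_k)/ℓ″(θ_k) is applied directly:

```python
# Newton's method for the MLE of a Cauchy location parameter theta,
# where f_theta(x) = 1 / (pi * (1 + (x - theta)^2)).
# The data below are made up purely for illustration.
data = [-1.2, -0.5, 0.0, 0.3, 0.8, 1.1, 1.5, 2.1]

def score(t):
    # First derivative of the log-likelihood, l'(t)
    return sum(2 * (x - t) / (1 + (x - t) ** 2) for x in data)

def score_deriv(t):
    # Second derivative of the log-likelihood, l''(t)
    return sum((2 * (x - t) ** 2 - 2) / (1 + (x - t) ** 2) ** 2 for x in data)

theta = sorted(data)[len(data) // 2]  # the median is a sensible starting point
for _ in range(100):
    step = score(theta) / score_deriv(theta)
    theta -= step
    if abs(step) < 1e-12:  # stop once the update is negligible
        break
```

The starting point matters here: the Cauchy likelihood can have multiple local maxima, which is exactly the "more than one solution to the likelihood equation" situation described in the introduction, and a robust initial estimate such as the sample median helps the iteration land on a sensible root.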
This note was uploaded on 03/12/2012 for the course STAT 411 taught by Professor Staff during the Spring '08 term at Ill. Chicago.