NotesNov17 - The value of θ that maximizes the likelihood...

The value of θ that maximizes the likelihood function is called the maximum likelihood estimator (MLE) of the unknown parameter.

Intuition: the joint pmf

    p_{X_1,...,X_n}(x_1,...,x_n; θ) = ∏_{j=1}^n p_X(x_j; θ)

expresses the probability (likelihood) of obtaining the observed values X_1 = x_1, X_2 = x_2, ..., X_n = x_n if the true value of the parameter is θ. The maximum likelihood estimator of the unknown parameter is that value of θ that makes the observed values the most likely.
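As a minimal sketch of this idea, assume a Bernoulli(θ) sample (this distribution and the data below are illustrative, not from the notes). The likelihood is the product of the per-observation pmf values, and a grid search over candidate θ picks out the maximizer:

```python
# Hypothetical example: X_j ~ Bernoulli(theta), so p_X(x; theta) = theta^x (1-theta)^(1-x).
def likelihood(data, theta):
    """Joint pmf of the sample, evaluated at a candidate theta."""
    prod = 1.0
    for x in data:
        prod *= theta**x * (1 - theta)**(1 - x)
    return prod

data = [1, 0, 1, 1, 0]
# Scan a grid of candidate theta values; the MLE is the maximizer.
thetas = [i / 100 for i in range(1, 100)]
mle = max(thetas, key=lambda t: likelihood(data, t))
print(mle)  # 0.6, the sample mean 3/5
```

For a Bernoulli sample the grid maximizer lands on the sample proportion of ones, which is the analytic MLE.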
In practice it is often easier to maximize the logarithm of the likelihood function, the so-called log-likelihood function

    L(x_1,...,x_n; θ) = log p_{X_1,...,X_n}(x_1,...,x_n; θ) = ∑_{j=1}^n log p_X(x_j; θ).
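Continuing the same hypothetical Bernoulli(θ) sample, the log-likelihood replaces the product with a sum, which is easier to differentiate and numerically more stable; since log is increasing, it has the same maximizer as the likelihood itself:

```python
import math

# Same hypothetical Bernoulli sample; the log-likelihood is a sum of log pmf terms.
def log_likelihood(data, theta):
    return sum(x * math.log(theta) + (1 - x) * math.log(1 - theta) for x in data)

data = [1, 0, 1, 1, 0]
thetas = [i / 100 for i in range(1, 100)]
mle = max(thetas, key=lambda t: log_likelihood(data, t))
print(mle)  # 0.6, the same maximizer as the raw likelihood
```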
The unknown parameter θ may be multidimensional (a vector). Example: estimate both parameters n and p of the Binomial distribution; in that case θ = (n, p).
Example: Let X_1,...,X_n be a sample from a Poisson distribution with an unknown parameter λ. Find the MLE of λ.
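Working this example out: since log p_X(x; λ) = x log λ − λ − log(x!), the log-likelihood is L(λ) = (∑ x_j) log λ − nλ − ∑ log(x_j!). Setting dL/dλ = (∑ x_j)/λ − n = 0 gives the MLE λ̂ = (1/n) ∑ x_j, the sample mean. A short check with illustrative data (the sample below is invented) confirms this numerically:

```python
import math

def poisson_log_likelihood(data, lam):
    # log p_X(x; lambda) = x*log(lambda) - lambda - log(x!); lgamma(x+1) = log(x!)
    return sum(x * math.log(lam) - lam - math.lgamma(x + 1) for x in data)

data = [2, 3, 1, 4, 2, 3]  # hypothetical Poisson sample
sample_mean = sum(data) / len(data)  # 15/6 = 2.5
# Grid search over candidate lambda values confirms the analytic MLE.
lams = [i / 100 for i in range(1, 1001)]
mle = max(lams, key=lambda l: poisson_log_likelihood(data, l))
print(mle, sample_mean)  # both 2.5
```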

This note was uploaded on 12/03/2010 for the course OR&IE 3500 taught by Professor Samorodnitsky during the Fall '10 term at Cornell University (Engineering School).
