Lecture 21,22 - Moments & Interval Estimators

Lecture 21 ORIE3500/5500 Summer 2011 Li

Comments

1. The MLE may or may not be unbiased. Even when bias is present, however, it converges to 0 as the sample size increases.
2. The MLE is a consistent estimator.
3. (Invariance) If θ̂ is the MLE of an unknown parameter θ, then for any one-to-one function h, h(θ̂) is the MLE of h(θ).

1 Method of Moments Estimator

The main idea behind this approach is that for any distribution the moments are functions of the parameters. These functions can be inverted, so the parameters can be obtained as functions of the moments, and estimates of the moments then yield estimates of the parameters. If the distribution has k unknown parameters, one uses the first k moments to estimate them.

Suppose θ_1, ..., θ_k are the unknown parameters and m_i is the i-th moment, E(X^i), so that

    E(X^i) = m_i = m_i(θ_1, ..., θ_k).

Now estimate the i-th moment m_i by the i-th sample moment

    m̂_i = (1/n) Σ_{j=1}^{n} X_j^i.

Then equate the population moments with the sample moments and solve for θ_1, ..., θ_k:

    m̂_i = m_i(θ_1, ..., θ_k),   i = 1, ..., k.

The following example will help your understanding.

Example
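As a concrete sketch of the recipe above (not from the notes), consider a Normal(μ, σ²) sample. The first two population moments are m_1 = μ and m_2 = σ² + μ², so inverting gives μ̂ = m̂_1 and σ̂² = m̂_2 − m̂_1². The function name and the simulated data below are illustrative assumptions:

```python
import random

def method_of_moments_normal(xs):
    """Method-of-moments estimates (mu_hat, sigma2_hat) for a normal sample.

    Uses m1 = E(X) = mu and m2 = E(X^2) = sigma^2 + mu^2, inverted as
    mu_hat = m1_hat and sigma2_hat = m2_hat - m1_hat^2.
    """
    n = len(xs)
    m1_hat = sum(xs) / n                 # first sample moment
    m2_hat = sum(x * x for x in xs) / n  # second sample moment
    mu_hat = m1_hat
    sigma2_hat = m2_hat - m1_hat ** 2
    return mu_hat, sigma2_hat

# Simulated data with known parameters, to check the estimator recovers them.
random.seed(0)
sample = [random.gauss(3.0, 2.0) for _ in range(100_000)]
mu_hat, sigma2_hat = method_of_moments_normal(sample)
print(mu_hat, sigma2_hat)  # should be close to mu = 3 and sigma^2 = 4
```

Here k = 2, so two moment equations suffice; with more parameters one would set up and solve k such equations, usually numerically.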
This note was uploaded on 12/09/2011 for the course IMSE 0301 taught by Professor Song during the Spring '11 term at HKU.
