
# STA 2006 — MLE: Examples (Section 6.14)


MLE: Examples (Section 6.14), 2007–2008 Term II

## 1 Fisher Information

Denote $-E[\{\log(f(X_i;\theta))\}'']$ by $I(\theta)$. This quantity is called the **Fisher Information**. It measures the average curvature of the log-density at the true parameter value, and $1/I(\theta)$ is the asymptotic variance of $\sqrt{n}(\hat\theta - \theta)$. The asymptotic variance of $\hat\theta$ can therefore be written as

$$\mathrm{Var}(\hat\theta) \approx \frac{1}{n} \times \frac{1}{I(\theta)}. \tag{1}$$

Since $\hat\theta$ is consistent for $\theta$, Equation (1) can be viewed as a measure of the estimation error. It consists of two components:

1. Sample size: the factor $1/n$. The larger $n$ is, the smaller the error.
2. Model: if the model carries a good amount of "information" at the true parameter value, the estimation error should be small because of the factor $1/I(\theta)$.

**Example 1:** Suppose $X_1, \dots, X_n$ are i.i.d. Bernoulli($p$). Then

$$f(X_i; p) = p^{X_i}(1-p)^{1-X_i}, \qquad \log(f(X_i; p)) = X_i \log(p) + (1 - X_i)\log(1-p).$$

Therefore,

$$\{\log(f(X_i;p))\}' = \frac{d}{dp}\log(f(X_i;p)) = \frac{X_i}{p} - \frac{1-X_i}{1-p},$$

$$\{\log(f(X_i;p))\}'' = \frac{d^2}{dp^2}\log(f(X_i;p)) = -\frac{X_i}{p^2} - \frac{1-X_i}{(1-p)^2},$$

and, using $E[X_i] = p$,

$$I(p) = -E[\{\log(f(X_i;p))\}''] = E\!\left[\frac{X_i}{p^2} + \frac{1-X_i}{(1-p)^2}\right] = \frac{1}{p} + \frac{1}{1-p} = \frac{1}{p(1-p)}.$$
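The relationship in Equation (1) can be checked numerically. The sketch below (not part of the original notes; the function name `simulate_mle_variance` and the choices $p = 0.3$, $n = 200$ are illustrative) simulates many Bernoulli samples, computes the MLE $\hat p = \bar X$ for each, and compares the empirical variance of $\hat p$ with the theoretical value $1/(n\,I(p)) = p(1-p)/n$:

```python
import random

def simulate_mle_variance(p, n, reps, seed=0):
    """Monte Carlo estimate of Var(p_hat), where p_hat is the
    Bernoulli MLE (the sample mean) over n observations."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(reps):
        sample = [1 if rng.random() < p else 0 for _ in range(n)]
        estimates.append(sum(sample) / n)
    mean = sum(estimates) / reps
    # Empirical variance of the MLE across replications
    return sum((e - mean) ** 2 for e in estimates) / reps

p, n = 0.3, 200                       # illustrative values
fisher_info = 1.0 / (p * (1 - p))     # I(p) = 1 / (p(1-p))
theoretical = 1.0 / (n * fisher_info) # Var(p_hat) ~ p(1-p)/n, per Equation (1)
empirical = simulate_mle_variance(p, n, reps=5000)
print(theoretical, empirical)
```

With a few thousand replications the empirical variance should land within a few percent of the theoretical value, illustrating how both $1/n$ and $1/I(p)$ drive the estimation error.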


## This note was uploaded on 06/05/2011 for the course STATISTICS 2006 taught by Professor Ho during the Spring '11 term at CUHK.
