Delta Method

Often estimators are functions of other random variables, for example in the method of moments. Such functions of random variables can sometimes inherit a normal approximation from the underlying random variables.

Example: Method of moments for the exponential distribution. Let $X_i$, $i = 1, 2, \ldots, n$, be iid exponential with pdf
$$f(x; \theta) = \theta e^{-\theta x} \, I(x > 0).$$
The first moment is $\mu_1(\theta) = 1/\theta$, so the method of moments estimator is
$$\hat\theta_n = \frac{1}{\bar{X}_n}.$$
Notice this is of the form $\hat\theta_n = g(\bar{X}_n)$, where $g : \mathbb{R}^+ \to \mathbb{R}^+$ with $g(x) = 1/x$.

Theorem 1. Suppose $\bar{X}_n$ has an asymptotic normal distribution, that is,
$$\sqrt{n}\,(\bar{X}_n - \mu) \to N(0, \sigma^2)$$
in distribution as $n \to \infty$. Suppose $g$ is a function that is continuous and also has a derivative $g'$ at $\mu$, and that $g'(\mu) \neq 0$. Then
$$\sqrt{n}\,\bigl(g(\bar{X}_n) - g(\mu)\bigr) \to N\bigl(0, (g'(\mu))^2 \sigma^2\bigr).$$

Remark: The condition $g'(\mu) \neq 0$ is actually only needed so that
$$\frac{\sqrt{n}\,\bigl(g(\bar{X}_n) - g(\mu)\bigr)}{g'(\mu)\,\sigma} \to N(0, 1)$$
in distribution as $n \to \infty$.

Remark: Theorem 1 is called the delta method.

Proof (Outline): The first-order Taylor approximation of $g$ about the point $\mu$, evaluated at the random variable $\bar{X}_n$, is
$$g(\bar{X}_n) \approx g(\mu) + g'(\mu)(\bar{X}_n - \mu).$$
Subtracting $g(\mu)$ from both sides and multiplying by $\sqrt{n}$ gives
$$\sqrt{n}\,\bigl(g(\bar{X}_n) - g(\mu)\bigr) \approx g'(\mu)\,\sqrt{n}\,(\bar{X}_n - \mu) \to N\bigl(0, (g'(\mu))^2 \sigma^2\bigr).$$

Remark: A more careful study of Taylor's formula with remainder is needed to justify all steps in this approximation. For our purposes in this course these details are omitted.
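As a sanity check of the theorem on the exponential example, one can simulate. Here $g(x) = 1/x$ gives $g'(1/\theta) = -\theta^2$, and the exponential has variance $\sigma^2 = 1/\theta^2$, so the delta method predicts $\sqrt{n}\,(\hat\theta_n - \theta) \to N(0, \theta^2)$. The following Monte Carlo sketch (not part of the original notes; the parameter choices are illustrative) checks that the empirical standard deviation of $\sqrt{n}\,(\hat\theta_n - \theta)$ is close to $\theta$:

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 2.0   # true rate parameter (illustrative choice)
n = 5000      # sample size per replication
reps = 1000   # number of Monte Carlo replications

# Draw reps independent samples of size n; scale = 1/theta gives mean 1/theta.
samples = rng.exponential(scale=1.0 / theta, size=(reps, n))

# Method of moments estimator: theta_hat = 1 / X̄_n for each replication.
theta_hat = 1.0 / samples.mean(axis=1)

# Delta method predicts sqrt(n) * (theta_hat - theta) ≈ N(0, theta^2),
# since (g'(1/theta))^2 * Var(X_1) = theta^4 * (1/theta^2) = theta^2.
z = np.sqrt(n) * (theta_hat - theta)
print(z.mean(), z.std())  # mean near 0, std near theta = 2
```

The empirical standard deviation should match $\theta = 2$ up to Monte Carlo error, which illustrates how the estimator inherits its normal approximation from $\bar{X}_n$.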
Spring '11