Chapter 10 Asymptotic Evaluations

10.1 First calculate some moments for this distribution.
$$\mathrm{E}X = \theta/3, \qquad \mathrm{E}X^2 = 1/3, \qquad \mathrm{Var}\,X = \frac{1}{3}-\frac{\theta^2}{9}.$$
So $3\bar{X}_n$ is an unbiased estimator of $\theta$ with variance
$$\mathrm{Var}(3\bar{X}_n) = 9(\mathrm{Var}\,X)/n = (3-\theta^2)/n \to 0 \quad\text{as } n\to\infty.$$
So by Theorem 10.1.3, $3\bar{X}_n$ is a consistent estimator of $\theta$.

10.3 a. The log likelihood is
$$-\frac{n}{2}\log(2\pi\theta) - \frac{1}{2}\sum(x_i-\theta)^2/\theta.$$
Differentiate and set equal to zero, and a little algebra will show that the MLE is the root of $\theta^2+\theta-W=0$, where $W = \frac{1}{n}\sum x_i^2$. The roots of this equation are $(-1\pm\sqrt{1+4W})/2$, and the MLE is the root with the plus sign, as it has to be nonnegative.

b. The second derivative of the log likelihood is $(-2\sum x_i^2 + n\theta)/(2\theta^3)$, yielding an expected Fisher information of
$$I(\theta) = -\mathrm{E}_\theta\!\left(\frac{-2\sum X_i^2 + n\theta}{2\theta^3}\right) = \frac{n+2n\theta}{2\theta^2},$$
and by Theorem 10.1.12 the variance of the MLE is $1/I(\theta)$.

10.4 a. Write
$$\frac{\sum X_iY_i}{\sum X_i^2} = \frac{\sum X_i(\beta X_i+\epsilon_i)}{\sum X_i^2} = \beta + \frac{\sum X_i\epsilon_i}{\sum X_i^2}.$$
From normality and independence,
$$\mathrm{E}X_i\epsilon_i = 0,\quad \mathrm{Var}\,X_i\epsilon_i = \sigma^2(\mu^2+\tau^2),\quad \mathrm{E}X_i^2 = \mu^2+\tau^2,\quad \mathrm{Var}\,X_i^2 = 2\tau^2(2\mu^2+\tau^2),$$
and $\mathrm{Cov}(X_i^2, X_i\epsilon_i) = 0$. Applying the formulas of Example 5.5.27, the asymptotic mean and variance are
$$\mathrm{E}\!\left(\frac{\sum X_iY_i}{\sum X_i^2}\right) \approx \beta \quad\text{and}\quad \mathrm{Var}\!\left(\frac{\sum X_iY_i}{\sum X_i^2}\right) \approx \frac{n\sigma^2(\mu^2+\tau^2)}{[n(\mu^2+\tau^2)]^2} = \frac{\sigma^2}{n(\mu^2+\tau^2)}.$$

b. $\sum Y_i/\sum X_i = \beta + \sum\epsilon_i/\sum X_i$ with approximate mean $\beta$ and variance $\sigma^2/(n\mu^2)$.

c. $\frac{1}{n}\sum Y_i/X_i = \beta + \frac{1}{n}\sum \epsilon_i/X_i$ with approximate mean $\beta$ and variance $\sigma^2/(n\mu^2)$.
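As a numerical illustration of 10.1 (not part of the original solution), the sketch below simulates $3\bar{X}_n$ for increasing $n$. It assumes the pdf $f(x\mid\theta) = (1+\theta x)/2$ on $(-1,1)$, which is not stated in this excerpt but is consistent with the moments $\mathrm{E}X=\theta/3$ and $\mathrm{E}X^2=1/3$ used above; the value of $\theta$ is arbitrary.

```python
import numpy as np

# Sketch for Problem 10.1: consistency of 3*Xbar_n.
# Assumption: f(x|theta) = (1 + theta*x)/2 on (-1, 1), which gives
# E X = theta/3 and E X^2 = 1/3 as used above.
rng = np.random.default_rng(1)
theta = 0.5

def sample(n):
    # Inverse-CDF sampling from F(x) = x/2 + theta*x^2/4 + 1/2 - theta/4
    u = rng.uniform(size=n)
    return (-1 + np.sqrt(1 - theta * (2 - theta - 4 * u))) / theta

for n in (10, 100, 1000, 10000):
    print(n, 3 * sample(n).mean(), "variance bound:", (3 - theta**2) / n)
```

The printed estimates settle near $\theta$ as the variance bound $(3-\theta^2)/n$ shrinks, which is the consistency conclusion of Theorem 10.1.3.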
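Similarly for 10.3, a minimal simulation sketch, assuming the $\mathrm{n}(\theta,\theta)$ model implied by the log likelihood above; the parameter value, sample size, and replication count are arbitrary. It computes the MLE as the positive root of $\theta^2+\theta-W=0$ and compares its simulated variance with $1/I(\theta)$.

```python
import numpy as np

# Sketch for Problem 10.3: MLE of theta in the n(theta, theta) model is the
# positive root of theta^2 + theta - W = 0, where W = mean(x_i^2).
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 200, 5000

mles = np.empty(reps)
for r in range(reps):
    x = rng.normal(theta, np.sqrt(theta), size=n)   # n(theta, theta) sample
    W = np.mean(x**2)
    mles[r] = (-1 + np.sqrt(1 + 4 * W)) / 2          # root with the plus sign

# Asymptotic variance 1/I(theta), with I(theta) = n(1 + 2*theta)/(2*theta^2)
info = n * (1 + 2 * theta) / (2 * theta**2)
print("simulated Var(MLE):", mles.var())
print("1/I(theta):        ", 1 / info)
```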
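For 10.4(a), the following sketch compares the simulated variance of $\sum X_iY_i/\sum X_i^2$ with the approximation $\sigma^2/(n(\mu^2+\tau^2))$. It assumes $X_i\sim \mathrm{n}(\mu,\tau^2)$ and $\epsilon_i\sim \mathrm{n}(0,\sigma^2)$, as the moment formulas above suggest; the parameter values are arbitrary.

```python
import numpy as np

# Sketch for Problem 10.4(a): simulated mean/variance of sum(X_i Y_i)/sum(X_i^2)
# versus the approximations beta and sigma^2/(n*(mu^2 + tau^2)).
rng = np.random.default_rng(2)
beta, mu, tau, sigma, n, reps = 2.0, 1.0, 1.5, 0.8, 100, 20000

est = np.empty(reps)
for r in range(reps):
    x = rng.normal(mu, tau, size=n)
    y = beta * x + rng.normal(0.0, sigma, size=n)
    est[r] = np.sum(x * y) / np.sum(x**2)

print("simulated mean, var:", est.mean(), est.var())
print("approximations:     ", beta, sigma**2 / (n * (mu**2 + tau**2)))
```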
10.5 a. The integral of $\mathrm{E}T_n^2$ is unbounded near zero. We have
$$\mathrm{E}T_n^2 > \sqrt{\frac{n}{2\pi\sigma^2}} \int_0^1 \frac{1}{x^2}\, e^{-(x-\mu)^2/2\sigma^2}\,dx > \sqrt{\frac{n}{2\pi\sigma^2}}\, K \int_0^1 \frac{1}{x^2}\,dx = \infty,$$
where $K = \min_{0\le x\le 1} e^{-(x-\mu)^2/2\sigma^2} > 0$.

b. If we delete the interval $(-\delta,\delta)$, then the integrand is bounded, that is, over the range of integration $1/x^2 < 1/\delta^2$.

c. Assume $\mu > 0$. A similar argument works for $\mu < 0$. Then
$$P(-\delta < \bar{X} < \delta) = P\left[\sqrt{n}(-\delta-\mu) < \sqrt{n}(\bar{X}-\mu) < \sqrt{n}(\delta-\mu)\right] < P\left[Z < \sqrt{n}(\delta-\mu)\right],$$
where $Z \sim \mathrm{n}(0,1)$. For $\delta < \mu$, the probability goes to 0 as $n\to\infty$.

10.7 We need to assume that $\tau(\theta)$ is differentiable at $\theta = \theta_0$, the true value of the parameter. Then we apply Theorem 5.5.24 to Theorem 10.1.12.

10.9 We will do a more general problem that includes a) and b) as special cases. Suppose we want to estimate $\lambda^t e^{-\lambda}/t! = P(X = t)$. Let
$$T = T(X_1,\ldots,X_n) = \begin{cases} 1 & \text{if } X_1 = t\\ 0 & \text{if } X_1 \ne t.\end{cases}$$
Then $\mathrm{E}T = P(T=1) = P(X_1 = t)$, so $T$ is an unbiased estimator. Since $\sum X_i$ is a complete sufficient statistic for $\lambda$, $\mathrm{E}(T \mid \sum X_i)$ is the UMVUE. The UMVUE is 0 for $y = \sum x_i < t$, and for $y \ge t$,
$$\mathrm{E}(T \mid y) = P\Big(X_1 = t \,\Big|\, \textstyle\sum X_i = y\Big) = \frac{P(X_1 = t,\ \sum X_i = y)}{P(\sum X_i = y)} = \frac{P(X_1 = t)\, P\!\left(\sum_{i=2}^n X_i = y-t\right)}{P(\sum X_i = y)}$$
$$= \frac{\{\lambda^t e^{-\lambda}/t!\}\,\{[(n-1)\lambda]^{y-t} e^{-(n-1)\lambda}/(y-t)!\}}{(n\lambda)^y e^{-n\lambda}/y!} = \binom{y}{t}\frac{(n-1)^{y-t}}{n^y}.$$

a. The best unbiased estimator of $e^{-\lambda}$ is $((n-1)/n)^y$.

b. The best unbiased estimator of $\lambda e^{-\lambda}$ is $(y/n)[(n-1)/n]^{y-1}$.

c. Use the fact that, for constants $a$ and $b$,
$$\frac{d}{d\lambda}\,\lambda^a b^\lambda = b^\lambda \lambda^{a-1}(a + \lambda\log b),$$
to calculate the asymptotic variances of the UMVUEs. We have for $t = 0$,
$$\mathrm{ARE}\!\left(\left(\tfrac{n-1}{n}\right)^{n\hat\lambda},\ e^{-\hat\lambda}\right) = \left[\frac{e^{-\lambda}}{\left(\frac{n-1}{n}\right)^{n\lambda}\log\!\left(\frac{n-1}{n}\right)^{n}}\right]^2,$$
and for $t = 1$,
$$\mathrm{ARE}\!\left(\tfrac{n}{n-1}\,\hat\lambda\left(\tfrac{n-1}{n}\right)^{n\hat\lambda},\ \hat\lambda e^{-\hat\lambda}\right) = \left[\frac{(\lambda-1)e^{-\lambda}}{\frac{n}{n-1}\left(\frac{n-1}{n}\right)^{n\lambda}\left(1 + \lambda\log\!\left(\frac{n-1}{n}\right)^{n}\right)}\right]^2.$$
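As a quick check of the two ARE expressions above (not part of the original solution), the sketch below evaluates them numerically; both tend to 1 as $n\to\infty$, since $\left(\frac{n-1}{n}\right)^{n\lambda}\to e^{-\lambda}$ and $\log\left(\frac{n-1}{n}\right)^n\to -1$. The value of $\lambda$ is an arbitrary choice.

```python
import numpy as np

# Sketch for Problem 10.9(c): evaluate the two ARE expressions above.
# Both should approach 1 as n grows.
lam = 2.0

def are_t0(n):
    b = ((n - 1) / n) ** n                      # b = ((n-1)/n)^n
    return (np.exp(-lam) / (b**lam * np.log(b))) ** 2

def are_t1(n):
    b = ((n - 1) / n) ** n
    denom = (n / (n - 1)) * b**lam * (1 + lam * np.log(b))
    return ((lam - 1) * np.exp(-lam) / denom) ** 2

for n in (5, 10, 50, 100, 1000):
    print(n, are_t0(n), are_t1(n))
```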
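The unbiasedness claims in 10.9(a) and (b) can also be verified numerically. Below is a minimal sketch that sums each UMVUE against the Poisson$(n\lambda)$ pmf of $y=\sum x_i$ (the infinite sum is truncated); the choices of $\lambda$ and $n$ are arbitrary.

```python
import numpy as np
from math import exp, lgamma, log

# Sketch for Problem 10.9(a),(b): E over Y ~ Poisson(n*lambda) of the UMVUEs,
# with the sum over y truncated at 200 terms.
lam, n = 1.3, 7
ys = np.arange(0, 200)
log_pmf = ys * log(n * lam) - n * lam - np.array([lgamma(y + 1) for y in ys])
pmf = np.exp(log_pmf)

est_a = ((n - 1) / n) ** ys                     # UMVUE of exp(-lambda)
est_b = (ys / n) * ((n - 1) / n) ** (ys - 1)    # UMVUE of lambda*exp(-lambda)

print("E[est_a] =", np.sum(pmf * est_a), " target:", exp(-lam))
print("E[est_b] =", np.sum(pmf * est_b), " target:", lam * exp(-lam))
```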
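Returning to 10.5, a small numerical illustration of parts (a) and (c): the truncated integral $\int_\epsilon^1 x^{-2}e^{-(x-\mu)^2/2\sigma^2}\,dx$ grows without bound as $\epsilon\downarrow 0$, and $P(Z<\sqrt{n}(\delta-\mu))\to 0$ when $\delta<\mu$. This sketch uses SciPy's quad and normal cdf; the parameter values are arbitrary.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Sketch for Problem 10.5: (a) divergence of the integral near zero,
# (c) vanishing of the normal tail probability when delta < mu.
mu, sigma, delta = 1.0, 1.0, 0.5

f = lambda x: np.exp(-(x - mu) ** 2 / (2 * sigma**2)) / x**2
for eps in (1e-1, 1e-2, 1e-3, 1e-4):
    val, _ = quad(f, eps, 1.0)
    print("eps =", eps, " truncated integral =", val)

for n in (10, 100, 1000):
    print("n =", n, " P(Z < sqrt(n)(delta-mu)) =", norm.cdf(np.sqrt(n) * (delta - mu)))
```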