Ch10sol - Chapter 10 Asymptotic Evaluations

Chapter 10 Asymptotic Evaluations

10.1 First calculate some moments for this distribution:
$E X = \theta/3$, $E X^2 = 1/3$, $\mathrm{Var}\,X = \frac{1}{3} - \frac{\theta^2}{9}$.
So $3\bar{X}_n$ is an unbiased estimator of $\theta$ with variance
$\mathrm{Var}(3\bar{X}_n) = 9(\mathrm{Var}\,X)/n = (3 - \theta^2)/n \to 0$ as $n \to \infty$.
So by Theorem 10.1.3, $3\bar{X}_n$ is a consistent estimator of $\theta$.

10.3 a. The log likelihood is
$-\frac{n}{2}\log(2\pi\theta) - \frac{1}{2}\sum_i (x_i - \theta)^2/\theta$.
Differentiate and set equal to zero; a little algebra shows that the MLE is a root of $\theta^2 + \theta - W = 0$, where $W = \frac{1}{n}\sum_i x_i^2$. The roots of this equation are $(-1 \pm \sqrt{1+4W})/2$, and the MLE is the root with the plus sign, since it must be nonnegative.

b. The second derivative of the log likelihood is $(-2\sum_i x_i^2 + n\theta)/(2\theta^3)$, yielding an expected Fisher information of
$I(\theta) = -E_\theta\!\left[\frac{-2\sum_i X_i^2 + n\theta}{2\theta^3}\right] = \frac{2n\theta + n}{2\theta^2}$,
and by Theorem 10.1.12 the variance of the MLE is $1/I(\theta)$.

10.4 a. Write
$\frac{\sum X_i Y_i}{\sum X_i^2} = \frac{\sum X_i(\beta X_i + \epsilon_i)}{\sum X_i^2} = \beta + \frac{\sum X_i \epsilon_i}{\sum X_i^2}.$
From normality and independence,
$E\,X_i\epsilon_i = 0,\quad \mathrm{Var}\,X_i\epsilon_i = \sigma^2(\mu^2+\tau^2),\quad E\,X_i^2 = \mu^2+\tau^2,\quad \mathrm{Var}\,X_i^2 = 2\tau^2(2\mu^2+\tau^2),$
and $\mathrm{Cov}(X_i^2, X_i\epsilon_i) = 0$. Applying the formulas of Example 5.5.27, the asymptotic mean and variance are
$E\!\left(\frac{\sum X_i Y_i}{\sum X_i^2}\right) \approx \beta$ and $\mathrm{Var}\!\left(\frac{\sum X_i Y_i}{\sum X_i^2}\right) \approx \frac{n\sigma^2(\mu^2+\tau^2)}{[n(\mu^2+\tau^2)]^2} = \frac{\sigma^2}{n(\mu^2+\tau^2)}.$

b. $\frac{\sum Y_i}{\sum X_i} = \beta + \frac{\sum \epsilon_i}{\sum X_i}$, with approximate mean $\beta$ and variance $\sigma^2/(n\mu^2)$.

10-2 Solutions Manual for Statistical Inference

c. $\frac{1}{n}\sum \frac{Y_i}{X_i} = \beta + \frac{1}{n}\sum \frac{\epsilon_i}{X_i}$, with approximate mean $\beta$ and variance $\sigma^2/(n\mu^2)$.

10.5 a. The integral defining $E\,T_n^2$ is unbounded near zero. We have
$E\,T_n^2 > \sqrt{\frac{n}{2\pi\sigma^2}} \int_0^1 \frac{1}{x^2}\, e^{-n(x-\mu)^2/2\sigma^2}\,dx > \sqrt{\frac{n}{2\pi\sigma^2}}\, K \int_0^1 \frac{1}{x^2}\,dx = \infty,$
where $K = \min_{0 \le x \le 1} e^{-n(x-\mu)^2/2\sigma^2} > 0$.

b. If we delete the interval $(-\delta, \delta)$, then the integrand is bounded; that is, over the range of integration, $1/x^2 < 1/\delta^2$.

c. Assume $\mu > 0$. A similar argument works for $\mu < 0$.
Then
$P(-\delta < \bar{X} < \delta) = P\!\left[\frac{\sqrt{n}(-\delta-\mu)}{\sigma} < \frac{\sqrt{n}(\bar{X}-\mu)}{\sigma} < \frac{\sqrt{n}(\delta-\mu)}{\sigma}\right] < P\!\left[Z < \frac{\sqrt{n}(\delta-\mu)}{\sigma}\right],$
where $Z \sim \mathrm{n}(0,1)$. For $\delta < \mu$, this probability goes to 0 as $n \to \infty$.

10.7 We need to assume that $\tau(\theta)$ is differentiable at $\theta = \theta_0$, the true value of the parameter. Then we apply Theorem 5.5.24 to Theorem 10.1.12.

10.9 We will do a more general problem that includes a) and b) as special cases. Suppose we want to estimate $\lambda^t e^{-\lambda}/t! = P(X = t)$. Let
$T = T(X_1, \ldots, X_n) = \begin{cases} 1 & \text{if } X_1 = t \\ 0 & \text{if } X_1 \ne t. \end{cases}$
Then $E\,T = P(T = 1) = P(X_1 = t)$, so $T$ is an unbiased estimator. Since $\sum X_i$ is a complete sufficient statistic for $\lambda$, $E(T \mid \sum X_i)$ is the UMVUE. The UMVUE is 0 for $y = \sum X_i < t$, and for $y \ge t$,
$E(T \mid y) = P\!\left(X_1 = t \,\Big|\, \sum X_i = y\right) = \frac{P(X_1 = t,\ \sum X_i = y)}{P(\sum X_i = y)} = \frac{P(X_1 = t)\, P(\sum_{i=2}^n X_i = y-t)}{P(\sum X_i = y)} = \frac{\{\lambda^t e^{-\lambda}/t!\}\{[(n-1)\lambda]^{y-t} e^{-(n-1)\lambda}/(y-t)!\}}{(n\lambda)^y e^{-n\lambda}/y!} = \binom{y}{t}\left(\frac{1}{n}\right)^t \left(1 - \frac{1}{n}\right)^{y-t}.$
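The ratio of Poisson probabilities in 10.9 simplifies to a $\mathrm{Binomial}(y, 1/n)$ pmf at $t$, with $\lambda$ cancelling entirely. A short numerical check of this claim, as a sketch only (the function names `pois_pmf`, `umvue`, and `binom_pmf` are ours, not from the text):

```python
from math import comb, exp, factorial

def pois_pmf(k, lam):
    """Poisson(lam) probability mass at k."""
    return lam ** k * exp(-lam) / factorial(k)

def umvue(t, y, n, lam=2.0):
    """E(T | sum X_i = y), computed directly as the ratio of Poisson
    probabilities from the 10.9 derivation.  The value of lam is
    arbitrary: it cancels in the ratio."""
    num = pois_pmf(t, lam) * pois_pmf(y - t, (n - 1) * lam)
    den = pois_pmf(y, n * lam)
    return num / den

def binom_pmf(t, y, p):
    """Binomial(y, p) probability mass at t."""
    return comb(y, t) * p ** t * (1 - p) ** (y - t)

# The conditional expectation matches the binomial(y, 1/n) pmf at t,
# for every lambda tried.
n = 5
for lam in (0.5, 1.7, 4.0):
    for y in range(10):
        for t in range(y + 1):
            assert abs(umvue(t, y, n, lam) - binom_pmf(t, y, 1 / n)) < 1e-12
```

This also illustrates why the estimator is usable: the conditional expectation depends on the data only through $y = \sum X_i$ and on $n$, not on the unknown $\lambda$.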

This note was uploaded on 04/18/2010 for the course STAT 622, taught by Professor Peruggia, M. during the Spring '08 term at Ohio State.
