Dr. Hackney STA Solutions pg 121

7-24 Solutions Manual for Statistical Inference

c. $\pi(\theta \mid x) = \mathrm{n}(\bar{x}, \sigma^2/n)$. So, substituting into the formula for a normal mgf, we find $\mathrm{E}\, e^{-c\theta} = e^{-c\bar{x} + \sigma^2 c^2/(2n)}$, and the LINEX posterior loss is
$$\mathrm{E}\bigl(L(\theta, a) \mid x\bigr) = e^{c(a-\bar{x}) + \sigma^2 c^2/(2n)} - c(a - \bar{x}) - 1.$$
Substitute $\mathrm{E}\, e^{-c\theta} = e^{-c\bar{x} + \sigma^2 c^2/(2n)}$ into the formula in part (b) to find the Bayes rule is $\bar{x} - c\sigma^2/(2n)$.

d. For an estimator $\bar{X} + b$, the LINEX posterior loss (from part (c)) is
$$\mathrm{E}\bigl(L(\theta, \bar{x} + b) \mid x\bigr) = e^{cb}\, e^{c^2\sigma^2/(2n)} - cb - 1.$$
For $\bar{X}$ the expected loss is $e^{c^2\sigma^2/(2n)} - 1$, and for the Bayes estimator ($b = -c\sigma^2/(2n)$) the expected loss is $c^2\sigma^2/(2n)$. The marginal distribution of $\bar{X}$ is $m(\bar{x}) = 1$, so the Bayes risk is infinite for any estimator of the form $\bar{X} + b$.

e. For $\bar{X} + b$, the squared error risk is $\mathrm{E}\bigl((\bar{X} + b) - \theta\bigr)^2 = \sigma^2/n + b^2$, so $\bar{X}$ ($b = 0$) is better than the Bayes estimator. The Bayes risk is infinite for both estimators.

7.66 Let $S = \sum_i X_i \sim \mathrm{binomial}(n, \theta)$.

a. $\mathrm{E}\,\hat{\theta}^2 = \mathrm{E}\, \frac{S^2}{n^2} = \frac{1}{n^2}\, \mathrm{E}\, S^2 = \frac{1}{n^2}\bigl(n\theta(1-\theta) + (n\theta)^2\bigr)$
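As a sanity check on the part (c)–(d) algebra (not part of the manual's solution), the posterior LINEX loss can be minimized numerically and compared with the closed-form Bayes rule $\bar{x} - c\sigma^2/(2n)$. The sample values of `xbar`, `sigma`, `n`, and `c` below are arbitrary illustrations.

```python
import math

# Posterior is N(xbar, sigma^2/n); the posterior expected LINEX loss is
#   rho(a) = exp(c*(a - xbar) + sigma^2*c^2/(2n)) - c*(a - xbar) - 1,
# which part (c) says is minimized at a = xbar - c*sigma^2/(2n).
xbar, sigma, n, c = 1.3, 2.0, 25, 0.7

def posterior_linex_loss(a):
    return (math.exp(c * (a - xbar) + sigma**2 * c**2 / (2 * n))
            - c * (a - xbar) - 1)

bayes_rule = xbar - c * sigma**2 / (2 * n)

# Grid search for the minimizer over xbar +/- 1 at step 1e-4.
grid = [xbar - 1 + i * 1e-4 for i in range(20001)]
numeric_min = min(grid, key=posterior_linex_loss)
print(bayes_rule, numeric_min)  # should agree to grid resolution

# Part (d): the loss at the Bayes rule should equal c^2*sigma^2/(2n).
print(posterior_linex_loss(bayes_rule), c**2 * sigma**2 / (2 * n))
```

At the Bayes rule the exponent $c(a-\bar{x}) + \sigma^2 c^2/(2n)$ vanishes, so the loss reduces to $-c(a-\bar{x}) = c^2\sigma^2/(2n)$, matching the second printed comparison.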