STAT210A HW07
Due: Tuesday, October 20, 2009

7.1. In the inverse binomial sampling procedure, N is a random variable counting the number of failures observed before the x-th success, so that sampling stops after a total of N + x Bernoulli trials (with parameter θ).

(a) Show that the best (minimum-variance) unbiased estimator of θ is δ*(N) = (x − 1)/(N + x − 1).

(b) Show that the information contained in N about θ is I(θ) = x/[θ²(1 − θ)].

(c) Show that var(δ*) > 1/I(θ).

7.2. Consider a scale family (1/θ) f(x/θ), θ > 0, where f is some fixed density function.

(a) Show that the amount of information a single observation X contains about θ is

    I(θ) = (1/θ²) ∫ [y f′(y)/f(y) + 1]² f(y) dy.

(b) Show that the information X contains about ξ = log θ does not depend on θ.

7.3. Given a family {p(x; θ) | θ ∈ Θ} and an estimator δ(·) with g(θ) = E_θ[δ(X)], the information bound is B(θ) = [g′(θ)]²/I(θ). Now suppose that ...
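The claims in 7.1 lend themselves to a quick Monte Carlo sanity check. The sketch below is illustrative only, not part of the assignment: the parameter values (θ = 0.3, x = 5) are arbitrary choices, and the Fisher information is taken to be x/[θ²(1 − θ)], the standard value for this sampling scheme.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, x = 0.3, 5        # arbitrary example values, x >= 2 so delta* is defined
n_reps = 200_000

# numpy's negative_binomial(n, p) draws the number of failures before the
# n-th success, which matches N in the problem (total trials = N + x).
N = rng.negative_binomial(x, theta, size=n_reps)

delta = (x - 1) / (N + x - 1)        # candidate estimator from part (a)
info = x / (theta**2 * (1 - theta))  # Fisher information, part (b)

print(delta.mean())                  # close to theta, consistent with unbiasedness
print(delta.var(), 1 / info)         # empirical variance exceeds the bound, part (c)
```

With a sample this large, the empirical mean lands within a fraction of a percent of θ, while the empirical variance stays strictly above the Cramér–Rao bound 1/I(θ), as part (c) asserts.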