STAT210A HW07 Due: Tuesday, October 20, 2009

7.1. In the inverse binomial sampling procedure, N is a random variable counting the number of failures observed before the x-th success, so that a total of N + x Bernoulli trials (each with success probability θ) are performed.
(a) Show that the best (minimum variance) unbiased estimator of θ is given by δ*(N) = (x - 1)/(N + x - 1).
(b) Show that the information contained in N about θ is I(θ) = x/[θ²(1 - θ)].
(c) Show that var(δ*) > 1/I(θ).

7.2. Consider a scale family (1/θ) f(x/θ), θ > 0, where f is some fixed density function.
(a) Show that the amount of information that a single observation X contains about θ is given by
    (1/θ²) ∫ [ y f′(y)/f(y) + 1 ]² f(y) dy.
(b) Show that the information X contains about ξ = log θ is independent of θ.

7.3. Given a family { p(x; θ) | θ ∈ Θ } and an estimator δ(X) with g(θ) = E_θ[δ(X)], the information bound is B(θ) = [g′(θ)]² / I(θ). Now suppose that ...
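These problems are pencil-and-paper derivations, but a quick Monte Carlo run can serve as a sanity check on 7.1(a) and on the scale-family integral in 7.2(a). The sketch below is not part of the assignment; the value x = 5, the grid of θ values, and the choice of a standard normal f are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    reps = 200_000

    # 7.1(a): delta*(N) = (x - 1)/(N + x - 1) should be unbiased for theta.
    # numpy's negative_binomial(x, theta) draws the number of failures observed
    # before the x-th success, which matches N in the inverse binomial setup.
    x = 5
    for theta in (0.2, 0.5, 0.8):
        N = rng.negative_binomial(x, theta, size=reps)
        delta_star = (x - 1) / (N + x - 1)
        print(f"theta = {theta:.1f}: mean of delta* = {delta_star.mean():.4f}")

    # 7.2(a), with the illustrative choice f = standard normal density:
    # then y f'(y)/f(y) = -y^2, so the integral is E[(1 - Y^2)^2] with Y ~ N(0, 1),
    # which equals 2, giving information 2/theta^2 for the normal scale family.
    Y = rng.standard_normal(reps)
    print("integral estimate:", ((1 - Y**2) ** 2).mean())

With this many replications, the estimated means should agree with θ to roughly three decimal places, and the integral estimate should come out close to 2.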