Stat 411 Lecture Notes Supplement
Rao-Blackwell and Lehmann-Scheffé Theorems


Ryan Martin
Spring 2012

1 Introduction

Unbiased estimation is a fundamental development in the theory of statistical inference. Nowadays there is considerably less emphasis on unbiasedness in statistical theory and practice, particularly because there are other, more pressing concerns in modern high-dimensional problems (e.g., regularization). Nevertheless, it is important for students of statistics to learn and appreciate these classical developments. In this note, I'll elaborate a bit more on the Rao-Blackwell and Lehmann-Scheffé theorems presented in class. In particular, I'll give proofs of both results (the proof of the Rao-Blackwell theorem given below is different from the one given in class), along with some further comments. The majority of the material presented here is taken from Chapter 2 of Lehmann and Casella, Theory of Point Estimation, Springer, 1998.

Herein I shall assume, as usual, that X_1, ..., X_n are iid with common PDF/PMF f_θ(x), where θ is an unknown parameter to be estimated. For the presentation here, I shall assume that θ is a real-valued quantity; however, with some modifications, things can be extended to the more general vector-valued case. The focus is on producing unbiased estimators of θ — specifically, unbiased estimators with small variance. For that reason, I shall also assume throughout that all estimators in question have finite variance: if there is at least one unbiased estimator with finite variance, then there's clearly no need to consider one without finite variance; and if there are no unbiased estimators with finite variance, then there's really nothing for us to do.

2 Minimum variance unbiased estimation

As I discussed earlier in the course, unbiasedness is a good property for an estimator to have; however, I gave several examples which showed that unbiasedness does not necessarily make an estimator good.
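The in-class examples are not reproduced here, but a standard textbook illustration of this point (not necessarily one of the examples from class) is the estimator (-1)^X for a single observation X ~ Poisson(λ): since E[t^X] = exp(λ(t-1)), setting t = -1 shows that (-1)^X is unbiased for exp(-2λ), even though it only ever takes the values +1 and -1 and so is absurd as an estimate of a quantity in (0, 1). A small simulation sketch:

```python
import numpy as np

# "Absurd but unbiased": for X ~ Poisson(lam), E[(-1)^X] = exp(-2*lam),
# so (-1)^X is unbiased for exp(-2*lam) even though its only possible
# values are +1 and -1.
rng = np.random.default_rng(0)
lam = 1.0
x = rng.poisson(lam, size=200_000)

estimates = (-1.0) ** x
print(estimates.mean())   # sample average is close to exp(-2) ~ 0.135
print(np.exp(-2 * lam))   # the target quantity
```

The sample average of the estimator matches exp(-2λ), confirming unbiasedness, yet no single realization of the estimator is anywhere near the target.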
But if one insists on an unbiased estimator of θ, then it is natural to seek out the best among them. Here "best" means the one with the smallest variance. That is, the goal is to find the minimum variance unbiased estimator (MVUE) of θ. After a brief discussion of MVUE uniqueness, I elaborate on the two main theorems presented in class, which are often used in tandem to find the unique MVUE.

* Version: February 28, 2012. Please do not distribute these notes without the author's consent (rgmartin@math.uic.edu). These notes are meant solely to supplement in-class lectures. The author makes no guarantees that these notes are free of typos or other, more serious errors.
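As a preview of how the Rao-Blackwell theorem reduces variance in practice, here is a sketch of Rao-Blackwellization on a standard example (the example and all names in the code are illustrative choices, not taken from these notes): with X_1, ..., X_n iid Poisson(λ), the target is θ = P(X_1 = 0) = exp(-λ). The crude indicator 1{X_1 = 0} is unbiased; conditioning on the sufficient statistic T = ΣX_i (given T = t, X_1 ~ Binomial(t, 1/n)) yields the improved estimator (1 - 1/n)^T, which is still unbiased but has smaller variance.

```python
import numpy as np

# Rao-Blackwellization sketch: X_1,...,X_n iid Poisson(lam),
# target theta = P(X_1 = 0) = exp(-lam).
rng = np.random.default_rng(1)
lam, n, reps = 2.0, 10, 100_000
theta = np.exp(-lam)

x = rng.poisson(lam, size=(reps, n))
crude = (x[:, 0] == 0).astype(float)   # crude unbiased estimator 1{X_1 = 0}
t = x.sum(axis=1)                      # sufficient statistic T = sum of X_i
rb = (1 - 1 / n) ** t                  # E[ 1{X_1 = 0} | T ] = (1 - 1/n)^T

print(crude.mean(), rb.mean())   # both are near theta = exp(-2)
print(crude.var(), rb.var())     # the Rao-Blackwellized variance is smaller
```

Both estimators average to θ across replications, but the conditioned estimator has markedly smaller variance, exactly the improvement the Rao-Blackwell theorem guarantees.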

This note was uploaded on 03/12/2012 for the course STAT 411, taught by Professor Staff during the Spring '08 term at the University of Illinois at Chicago.


