Feb. 24, 2003

CHAPTER 2. SUFFICIENCY AND ESTIMATION

2.1 Sufficient statistics. Classically, a statistic is a measurable function of the observations, say f(X₁, …, Xₙ). The concept of statistic differs from that of random variable in that a random variable is defined on a probability space, so that one probability measure is singled out, whereas for statistics we have in mind a family of possible probability measures. R. A. Fisher (1922) called a statistic sufficient when "no other statistic which can be calculated from the same sample provides any additional information as to the value of the parameter to be estimated." For example, given a sample (X₁, …, Xₙ) where the Xⱼ are i.i.d. N(θ, 1), the sample mean X̄ := (X₁ + ⋯ + Xₙ)/n turns out to be a sufficient statistic for the unknown parameter θ. J. Neyman (1935) gave one form of a factorization theorem for sufficient statistics. This section will give more general forms of the definitions and theorem due to Halmos and L. J. Savage (1949).

In general, given a measurable space (S, 𝓑), that is, a set S with a σ-algebra 𝓑 of subsets, and another measurable space (Y, 𝓕), a statistic is a measurable function T from S into Y. Often, Y = ℝ or a Euclidean space ℝᵈ with Borel σ-algebra. Let T⁻¹(𝓕) := {T⁻¹(F) : F ∈ 𝓕}. Then T⁻¹(𝓕) is a σ-algebra, and it is the smallest σ-algebra on S for which T is measurable. For any measure μ on 𝓑, L¹(μ) := L¹(S, 𝓑, μ) denotes the set of all real-valued, 𝓑-measurable, μ-integrable functions on S. For any probability measure P on (S, 𝓑), sub-σ-algebra 𝓐 ⊂ 𝓑, and f ∈ L¹(P), we have a conditional expectation E_P(f | 𝓐), defined as a function g ∈ L¹(S, 𝓐, P), so that g is measurable for 𝓐, such that for all A ∈ 𝓐, ∫_A g dP = ∫_A f dP. Such a g always exists, as a Radon–Nikodym derivative of the signed measure A ↦ ∫_A f dP with respect to the restriction of P to 𝓐 (RAP, Theorems 5.5.4, 10.1.1). If f = 1_B for some B ∈ 𝓑, let P(B | 𝓐) := E_P(1_B | 𝓐).
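To see concretely why X̄ is sufficient in the normal example above, one can verify Neyman's factorization directly. The following display is an added illustration, not part of the original notes; it uses the standard identity Σⱼ(xⱼ − θ)² = Σⱼ(xⱼ − x̄)² + n(x̄ − θ)². The joint density of (X₁, …, Xₙ) under N(θ, 1) is

```latex
\begin{aligned}
\prod_{j=1}^{n} \frac{1}{\sqrt{2\pi}}\, e^{-(x_j-\theta)^2/2}
  &= (2\pi)^{-n/2} \exp\Bigl(-\tfrac{1}{2}\textstyle\sum_{j=1}^{n}(x_j-\theta)^2\Bigr) \\
  &= \underbrace{(2\pi)^{-n/2} \exp\Bigl(-\tfrac{1}{2}\textstyle\sum_{j=1}^{n}(x_j-\bar{x})^2\Bigr)}_{h(x_1,\dots,x_n),\ \text{free of } \theta}
     \cdot
     \underbrace{\exp\Bigl(-\tfrac{n}{2}(\bar{x}-\theta)^2\Bigr)}_{g_\theta(\bar{x})}.
\end{aligned}
```

The density thus factors into a function of x̄ and θ times a function of the data alone, which is the factorization criterion to be made precise below.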
Then almost surely 0 ≤ P(B | 𝓐) ≤ 1. Conditional expectations for P are only defined up to equality P-almost surely. Now, we will need the fact that suitable 𝓐-measurable functions can be brought outside conditional expectations (RAP, Theorem 10.1.9), much as constants can be brought outside ordinary expectations:

2.1.1 Lemma. E_P(fg | 𝓐) = f·E_P(g | 𝓐) whenever both g and fg are in L¹(S, 𝓑, P) and f is 𝓐-measurable.

Now, here are precise definitions of sufficiency:

Definition. Given a family 𝓟 of probability measures on 𝓑, a sub-σ-algebra 𝓐 ⊂ 𝓑 is called sufficient for 𝓟 if and only if for every B ∈ 𝓑 there is an 𝓐-measurable function f_B such that f_B = P(B | 𝓐) P-almost surely for every P ∈ 𝓟. A statistic T from (S, 𝓑) to (Y, 𝓕) is called sufficient if and only if T⁻¹(𝓕) is sufficient.
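As a concrete instance of the definition (an added illustration, not from the original notes): let 𝓟 = {P_θ : 0 < θ < 1}, where under P_θ the coordinates X₁, …, Xₙ are i.i.d. Bernoulli(θ), and let T := X₁ + ⋯ + Xₙ. For any sample point with Σⱼ xⱼ = t, a direct computation gives

```latex
P_\theta\bigl(X_1=x_1,\dots,X_n=x_n \,\bigm|\, T=t\bigr)
  \;=\; \frac{\theta^{t}(1-\theta)^{n-t}}{\binom{n}{t}\,\theta^{t}(1-\theta)^{n-t}}
  \;=\; \binom{n}{t}^{-1},
```

which does not involve θ. So for each event B determined by the sample, a single T⁻¹(𝓕)-measurable function f_B serves as a version of P_θ(B | T⁻¹(𝓕)) simultaneously for every θ; that is, T is sufficient for 𝓟 in the sense just defined.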
This note was uploaded on 09/19/2011 for the course MATH 111 taught by Professor Jj during the Spring '09 term at AIU Online.