Lecture 13

13.1 Minimal jointly sufficient statistics.

When it comes to jointly sufficient statistics $(T_1, \ldots, T_k)$, the total number of them, $k$, is clearly very important and we would like it to be small. If we do not care about $k$, then we can always find trivial examples of jointly sufficient statistics. For instance, the entire sample $X_1, \ldots, X_n$ is obviously always sufficient, but this choice is not interesting. Another trivial example is the order statistics $Y_1 \le Y_2 \le \ldots \le Y_n$, which are simply the values $X_1, \ldots, X_n$ arranged in increasing order, i.e.
$$
Y_1 = \min(X_1, \ldots, X_n) \le \ldots \le Y_n = \max(X_1, \ldots, X_n).
$$
The order statistics $Y_1, \ldots, Y_n$ are jointly sufficient by the factorization criterion, since the joint density of an i.i.d. sample is a product over the observations and a product does not depend on the order of its factors:
$$
f(X_1, \ldots, X_n \mid \theta) = f(X_1 \mid \theta) \times \cdots \times f(X_n \mid \theta) = f(Y_1 \mid \theta) \times \cdots \times f(Y_n \mid \theta).
$$
When we face different choices of jointly sufficient statistics, how do we decide which one is better? The following definition seems natural.

Definition. (Minimal jointly sufficient statistics.) $(T_1, \ldots, T_k)$ are minimal jointly sufficient if, given any other jointly sufficient statistics $(r_1, \ldots, r_m)$, we have
$$
T_1 = g_1(r_1, \ldots, r_m), \; \ldots, \; T_k = g_k(r_1, \ldots, r_m),
$$
i.e. the $T$'s can be expressed as functions of the $r$'s.

How do we decide whether $(T_1, \ldots, T_k)$ is minimal? One possible way to do this is through the maximum likelihood estimator, as follows. ...
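The factorization argument above can be checked numerically: since the likelihood is a product over the observations, sorting the sample leaves it unchanged, so the order statistics retain everything the likelihood uses. A minimal sketch, assuming a normal model purely for illustration (the sample values and the parameter $\mu$ below are made up):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Density of N(mu, sigma^2) at x.
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def likelihood(sample, mu):
    # Joint density f(x_1 | mu) * ... * f(x_n | mu) of an i.i.d. sample.
    prod = 1.0
    for x in sample:
        prod *= normal_pdf(x, mu)
    return prod

sample = [1.3, -0.7, 2.1, 0.4]        # hypothetical data
order_stats = sorted(sample)          # Y_1 <= ... <= Y_n

# The product of densities is invariant under reordering the sample,
# which is exactly the factorization-criterion argument for Y_1,...,Y_n.
mu = 0.5
assert math.isclose(likelihood(sample, mu), likelihood(order_stats, mu))
```

The same check works for any i.i.d. model, since only the commutativity of multiplication is used.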
Spring '09, Dmitry Panchenko.