
University of Minnesota
Dept. of Electrical and Computer Engineering
EE 8581 DETECTION AND ESTIMATION THEORY
Spring 2010

Problem Set 4

Assigned: February 23, 2010
Due: March 2, 2010

Readings: Read Levy Sections 5.1-5.2 and Chapter 4.

Problems: Solve Problems 4.10 and 4.14 in Chapter 4 of Levy's book and the following problem:

Problem 3: We discussed in class the notion of a minimal sufficient statistic. Recall that t(r) is a sufficient statistic for the parameter θ if the density of r given t is independent of θ. The statistic t is minimal if it is a function of every other sufficient statistic.

1. We shall say that a sufficient statistic t is complete if E[W(t) | θ] = 0 for all θ implies that W(t) = 0 with probability 1 for all θ. Show that there is only one function of a complete sufficient statistic t that produces an unbiased estimate of θ.

2. Show that a complete sufficient statistic is minimal.

3. Assume that in a given problem n independent vector samples r_m, m = 1, 2, …, n, are available for processing. Each vector has a distribution of the form

    f(r_m | θ) = a(θ) b(r_m) exp( ∑_{i=1}^{k} p_i(θ) t_i(r_m) ),

where p_i(θ) and t_i(r_m) are scalar functions of θ and r_m, respectively. Show that the statistic T(r) = [T_1(r) T_2(r) … T_k(r)], where

    T_i(r) = ∑_{m=1}^{n} t_i(r_m),

is sufficient and complete for θ. Note that the dimension of the statistic is k, independent of the number n of vector samples available for processing.
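As a concrete check of Problem 3, the scalar Gaussian family N(μ, σ²) has the exponential-family form above with k = 2 natural statistics t_1(r) = r and t_2(r) = r², so T(r) = (∑ r_m, ∑ r_m²) should be sufficient regardless of n. The sketch below (all sample sizes and parameter values are illustrative choices, not from the problem set) verifies numerically that the maximum-likelihood estimates computed from T alone coincide with those computed from the full sample:

```python
import numpy as np

rng = np.random.default_rng(0)  # illustrative seed
n = 1000
mu_true, sigma_true = 2.0, 1.5
r = rng.normal(mu_true, sigma_true, size=n)

# For N(mu, sigma^2), the natural statistics are t1(r) = r and t2(r) = r^2,
# so T(r) = (sum r_m, sum r_m^2) has dimension k = 2 no matter how large n is.
T1, T2 = r.sum(), (r ** 2).sum()

# MLEs of (mu, sigma^2) computed from the two-dimensional statistic T alone...
mu_hat_T = T1 / n
var_hat_T = T2 / n - mu_hat_T ** 2

# ...agree with the MLEs computed from all n raw samples,
# consistent with T being sufficient for (mu, sigma^2).
mu_hat_full = r.mean()
var_hat_full = r.var()

print(mu_hat_T, mu_hat_full)
print(var_hat_T, var_hat_full)
```

The same pattern holds for any exponential family: summing each t_i over the samples compresses n vectors into a k-dimensional statistic with no loss of information about θ.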

