Lecture 19

In the last lecture we found the Bayes decision rule $\delta$ that minimizes the Bayes error

$$\alpha(\delta) = \sum_{i=1}^{k} \xi(i)\,\alpha_i(\delta) = \sum_{i=1}^{k} \xi(i)\,\mathbb{P}_i(\delta \neq H_i).$$

Let us write down this decision rule in the case of two simple hypotheses $H_1, H_2$. For simplicity of notation, given the sample $X = (X_1, \ldots, X_n)$ we will denote the joint p.d.f. by

$$f_i(X) = f_i(X_1) \cdots f_i(X_n).$$

Then in the case of two simple hypotheses the Bayes decision rule that minimizes the Bayes error

$$\alpha(\delta) = \xi(1)\,\mathbb{P}_1(\delta \neq H_1) + \xi(2)\,\mathbb{P}_2(\delta \neq H_2)$$

is given by

$$\delta = \begin{cases} H_1 & : \ \xi(1) f_1(X) > \xi(2) f_2(X) \\ H_2 & : \ \xi(2) f_2(X) > \xi(1) f_1(X) \\ H_1 \text{ or } H_2 & : \ \xi(1) f_1(X) = \xi(2) f_2(X) \end{cases}$$

or, equivalently,

$$\delta = \begin{cases} H_1 & : \ \dfrac{f_1(X)}{f_2(X)} > \dfrac{\xi(2)}{\xi(1)} \\[4pt] H_2 & : \ \dfrac{f_1(X)}{f_2(X)} < \dfrac{\xi(2)}{\xi(1)} \\[4pt] H_1 \text{ or } H_2 & : \ \dfrac{f_1(X)}{f_2(X)} = \dfrac{\xi(2)}{\xi(1)} \end{cases} \qquad (19.1)$$

(Here we use the conventions $1/0 = +\infty$ and $1/\infty = 0$.) This kind of test is called a likelihood ratio test, since it is expressed in terms of the ratio $f_1(X)/f_2(X)$ of the likelihood functions.

Example. Suppose we have only one observation $X_1$ and two simple hypotheses $H_1 : \mathbb{P} = N(0,1)$ and $H_2 : \mathbb{P} = N(1,1)$. Let us take the a priori distribution given by $\xi(1) = \frac{1}{2}$ and $\xi(2) = \frac{1}{2}$, i.e. both hypotheses have equal weight, and find a Bayes decision rule $\delta$.
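The preview ends here; the following continuation is a reconstructed sketch (not part of the preview) of how the computation in this example proceeds, applying (19.1) directly. Since $\xi(1) = \xi(2) = \frac{1}{2}$, the threshold is $\xi(2)/\xi(1) = 1$, and the likelihood ratio is

$$\frac{f_1(X_1)}{f_2(X_1)} = \frac{\frac{1}{\sqrt{2\pi}}\, e^{-X_1^2/2}}{\frac{1}{\sqrt{2\pi}}\, e^{-(X_1-1)^2/2}} = e^{\frac{(X_1-1)^2 - X_1^2}{2}} = e^{\frac{1}{2} - X_1}.$$

Comparing $e^{\frac{1}{2} - X_1}$ with the threshold $1$ shows that the rule decides $H_1$ when $X_1 < \frac{1}{2}$, decides $H_2$ when $X_1 > \frac{1}{2}$, and may decide either hypothesis when $X_1 = \frac{1}{2}$.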
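Because (19.1) is purely computational, a short code sketch may also help. The following Python snippet is not part of the original notes; the function name `bayes_decision` and the use of `scipy.stats.norm` are illustrative choices, and the rule is applied on the log scale, which is equivalent to (19.1) since the logarithm is increasing.

```python
import numpy as np
from scipy.stats import norm

def bayes_decision(sample, f1, f2, xi1=0.5, xi2=0.5):
    """Bayes decision rule (19.1) for two simple hypotheses.

    sample   : array of i.i.d. observations X_1, ..., X_n
    f1, f2   : density functions under H_1 and H_2
    xi1, xi2 : prior weights xi(1), xi(2)
    Returns "H1", "H2", or "H1 or H2" (tie).
    """
    # Compare log f_1(X) - log f_2(X) with log(xi(2)/xi(1));
    # working with logs avoids numerical underflow for large n.
    log_ratio = np.sum(np.log(f1(sample))) - np.sum(np.log(f2(sample)))
    threshold = np.log(xi2 / xi1)
    if log_ratio > threshold:
        return "H1"
    if log_ratio < threshold:
        return "H2"
    return "H1 or H2"

# The example from the lecture: one observation, H_1: N(0,1) vs H_2: N(1,1),
# equal prior weights xi(1) = xi(2) = 1/2.
x = np.array([0.3])
print(bayes_decision(x, norm(0, 1).pdf, norm(1, 1).pdf))  # "H1", since 0.3 < 1/2
```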