Joint probability density functions


Solution: To begin, we simplify the likelihood ratio as much as possible:

$$\Lambda(u) = \frac{\frac{1}{b\sqrt{2\pi}}\, e^{-u^2/2b^2}}{\frac{1}{a\sqrt{2\pi}}\, e^{-u^2/2a^2}} = \frac{a}{b}\, e^{-\frac{u^2}{2b^2} + \frac{u^2}{2a^2}} = \frac{a}{b}\, e^{\frac{u^2}{2}\left(\frac{1}{a^2} - \frac{1}{b^2}\right)}$$

The ML rule is to choose $H_1$ when $\Lambda(X) > 1$. Thus, by taking the natural logarithm of both sides of this inequality we obtain the rule: if $\ln\frac{a}{b} + \frac{X^2}{2}\left(\frac{1}{a^2} - \frac{1}{b^2}\right) > 0$, choose $H_1$. Equivalently, after a bit of algebra (assuming $b > a$), we find that the ML rule selects $H_1$ when

$$\ln\frac{b}{a} < \frac{X^2}{2}\,\frac{b^2 - a^2}{a^2 b^2} \qquad \text{or} \qquad \frac{2a^2 b^2 \ln(b/a)}{b^2 - a^2} < X^2.$$

Thus, the ML rule can be expressed as a threshold test on the magnitude $|X|$ of $X$:

$$|X| > K: \text{ declare } H_1 \text{ is true}; \qquad |X| < K: \text{ declare } H_0 \text{ is true}, \qquad (3.12)$$

where $K = K_{ML} = ab\sqrt{\dfrac{2\ln(b/a)}{b^2 - a^2}}$.

The MAP rule is to choose $H_1$ when $\Lambda(X) > \frac{\pi_0}{\pi_1}$. After a bit of algebra, we derive the rule that $H_1$ should be chosen when

$$\ln\frac{b\pi_0}{a\pi_1} < \frac{X^2}{2}\,\frac{b^2 - a^2}{a^2 b^2}.$$

Equivalently, the MAP rule is given by the magnitude threshold test (3.12) with the threshold $K = K_{MAP} = ab\sqrt{\dfrac{2\ln(b\pi_0/a\pi_1)}{b^2 - a^2}}$.
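As a numerical sanity check on the derivation above, the following sketch compares the likelihood ratio against 1 (for ML) and against $\pi_0/\pi_1$ (for MAP), and verifies that each comparison agrees with the corresponding magnitude threshold test. The parameter values $a = 1$, $b = 2$ and the priors $\pi_0 = 0.6$, $\pi_1 = 0.4$ are hypothetical choices for illustration, not part of the original problem.

```python
import math

# Hypothetical parameters (not from the original problem):
# under H0, X ~ N(0, a^2); under H1, X ~ N(0, b^2), with b > a.
a, b = 1.0, 2.0
pi0, pi1 = 0.6, 0.4  # assumed prior probabilities of H0 and H1

def gauss_pdf(u, sigma):
    """Zero-mean Gaussian density with standard deviation sigma."""
    return math.exp(-u**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(u):
    """Lambda(u) = f1(u) / f0(u)."""
    return gauss_pdf(u, b) / gauss_pdf(u, a)

# Closed-form thresholds from the derivation above.
K_ml = a * b * math.sqrt(2 * math.log(b / a) / (b**2 - a**2))
K_map = a * b * math.sqrt(2 * math.log(b * pi0 / (a * pi1)) / (b**2 - a**2))

# Lambda(x) > 1 exactly when |x| > K_ml, and
# Lambda(x) > pi0/pi1 exactly when |x| > K_map.
for x in [0.0, 0.5, 1.0, 1.5, 2.0, 3.0]:
    assert (likelihood_ratio(x) > 1) == (abs(x) > K_ml)
    assert (likelihood_ratio(x) > pi0 / pi1) == (abs(x) > K_map)

print(f"K_ML = {K_ml:.4f}, K_MAP = {K_map:.4f}")
```

Note that at the boundary $|x| = K_{ML}$ the likelihood ratio equals exactly 1, which is why the strict inequalities on either side of $K$ partition the decisions cleanly.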

This note was uploaded on 02/09/2014 for the course ISYE 2027 taught by Professor Zahrn during the Spring '08 term at Georgia Institute of Technology.
