
The conditional expectation estimator is

\[
E\big[\Theta \mid X = (30, 25, 15, 40, 20)\big]
= \frac{\int_0^{1/5} \theta^7 e^{-130\theta}\, d\theta}{\int_0^{1/5} (\theta')^6 e^{-130\theta'}\, d\theta'}.
\]

Solution to Problem 8.4. (a) Let $X$ denote the random variable representing the number of questions answered correctly. For each value $\theta \in \{\theta_1, \theta_2, \theta_3\}$, we have, using Bayes' rule,

\[
p_{\Theta \mid X}(\theta \mid k)
= \frac{p_\Theta(\theta)\, p_{X \mid \Theta}(k \mid \theta)}{\sum_{i=1}^{3} p_\Theta(\theta_i)\, p_{X \mid \Theta}(k \mid \theta_i)}.
\]

The conditional PMF $p_{X \mid \Theta}$ is binomial with $n = 10$ and probability of success $p_i$ equal to the probability of answering a question correctly, given that the student is of category $i$, i.e.,

\[
p_i = \theta_i + (1 - \theta_i)\cdot\frac{1}{3} = \frac{2\theta_i + 1}{3}.
\]

Thus we have

\[
p_1 = \frac{1.6}{3}, \qquad p_2 = \frac{2.4}{3}, \qquad p_3 = \frac{2.9}{3}.
\]

For a given number of correct answers $k$, the MAP rule selects the category $i$ for which the corresponding binomial probability $\binom{10}{k} p_i^k (1 - p_i)^{10-k}$ is maximized.

(b) The posterior PMF of $M$ is given by

\[
p_{M \mid X}(m \mid X = k)
= \sum_{i=1}^{3} p_{\Theta \mid X}(\theta_i \mid X = k)\, P(M = m \mid X = k, \Theta = \theta_i).
\]

The probabilities $p_{\Theta \mid X}(\theta_i \mid X = k)$ were calculated in part (a); for $k = 5$,

\[
p_{\Theta \mid X}(\theta_1 \mid X = 5) \approx 0.9010, \qquad
p_{\Theta \mid X}(\theta_2 \mid X = 5) \approx 0.0989, \qquad
p_{\Theta \mid X}(\theta_3 \mid X = 5) \approx 0.0001.
\]

The probability that the student knows the answer to a question that she answered correctly is

\[
q_i = \frac{\theta_i}{\theta_i + (1 - \theta_i)/3}, \qquad i = 1, 2, 3.
\]

The probabilities $P(M = m \mid X = k, \Theta = \theta_i)$ are binomial and can be calculated in the manner described in Problem 2(b); they are given by

\[
P(M = m \mid X = k, \Theta = \theta_i) = \binom{k}{m} q_i^m (1 - q_i)^{k-m}.
\]

For $k = 5$, the posterior PMF can be explicitly calculated for $m = 0, \ldots, 5$, and the MAP and LMS estimates can be obtained from it. We have

\[
p_{M \mid X}(0 \mid X = 5) \approx 0.0145,
\]
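The posterior values quoted for part (a) can be checked numerically. This is a minimal sketch, not part of the original solution; since the problem statement is not reproduced here, it assumes a uniform prior over the three categories and the parameters $\theta = (0.3, 0.7, 0.95)$ implied by $p_1 = 1.6/3$, $p_2 = 2.4/3$, $p_3 = 2.9/3$.

```python
from math import comb

# Assumed from context (problem statement not reproduced here):
# category parameters theta_i and a uniform prior over the categories.
thetas = [0.3, 0.7, 0.95]
prior = [1 / 3, 1 / 3, 1 / 3]
n, k = 10, 5  # 10 questions, k = 5 answered correctly

# Probability of a correct answer for category i: p_i = (2*theta_i + 1)/3
ps = [(2 * t + 1) / 3 for t in thetas]

# Unnormalized posterior: prior times binomial likelihood
joint = [prior[i] * comb(n, k) * ps[i] ** k * (1 - ps[i]) ** (n - k)
         for i in range(3)]
total = sum(joint)
posterior = [j / total for j in joint]

print([round(p, 4) for p in posterior])  # close to 0.9010, 0.0989, ~0.0000
map_category = max(range(3), key=lambda i: posterior[i]) + 1
print(map_category)  # MAP rule picks category 1 when k = 5
```

Under these assumptions the first two posteriors reproduce the quoted 0.9010 and 0.0989; the third comes out around $3 \times 10^{-5}$, consistent with the $\approx 0.0001$ above up to rounding.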
\[
p_{M \mid X}(1 \mid X = 5) \approx 0.0929, \qquad
p_{M \mid X}(2 \mid X = 5) \approx 0.2402, \qquad
p_{M \mid X}(3 \mid X = 5) \approx 0.3173,
\]
\[
p_{M \mid X}(4 \mid X = 5) \approx 0.2335, \qquad
p_{M \mid X}(5 \mid X = 5) \approx 0.1015.
\]

It follows that the MAP estimate is $\hat{m} = 3$. The conditional expectation estimate is

\[
E[M \mid X = 5] = \sum_{m=1}^{5} m\, p_{M \mid X}(m \mid X = 5) \approx 2.9668 \approx 3.
\]

Solution to Problem 8.5. According to the MAP rule, we need to maximize over $\theta \in [0, 1]$ the posterior PDF

\[
f_{\Theta \mid X}(\theta \mid k)
= \frac{f_\Theta(\theta)\, p_{X \mid \Theta}(k \mid \theta)}{\int f_\Theta(\theta')\, p_{X \mid \Theta}(k \mid \theta')\, d\theta'},
\]

where $X$ is the number of heads observed. Since the denominator is a positive constant, we only need to maximize

\[
f_\Theta(\theta)\, p_{X \mid \Theta}(k \mid \theta)
= \binom{n}{k}\Big(2 - 4\Big|\tfrac{1}{2} - \theta\Big|\Big)\, \theta^k (1 - \theta)^{n-k}.
\]

The function to be maximized is differentiable except at $\theta = 1/2$. This leads to three different possibilities: (a) the maximum is attained at $\theta = 1/2$; (b) the maximum is attained at some $\theta < 1/2$, at which the derivative is equal to zero; (c) the maximum is attained at some $\theta > 1/2$, at which the derivative is equal to zero.

Let us consider the second possibility. For $\theta < 1/2$, we have $f_\Theta(\theta) = 4\theta$. The function to be maximized, ignoring the constant term $4\binom{n}{k}$, is $\theta^{k+1}(1 - \theta)^{n-k}$. By setting the derivative to zero, we find $\hat{\theta} = (k + 1)/(n + 1)$, provided that $(k + 1)/(n + 1) < 1/2$.

Let us now consider the third possibility. For $\theta > 1/2$, we have $f_\Theta(\theta) = 4(1 - \theta)$. The function to be maximized, ignoring the constant term $4\binom{n}{k}$, is $\theta^k (1 - \theta)^{n-k+1}$.
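As a sanity check on part (b), the mixture-of-binomials posterior of $M$ can be evaluated directly. The sketch below uses the posterior weights 0.9010, 0.0989, 0.0001 quoted above; the $\theta_i$ values (0.3, 0.7, 0.95) are an assumption inferred from $p_i = (2\theta_i + 1)/3$.

```python
from math import comb

# theta_i values are assumed (inferred from p_i = (2*theta_i + 1)/3);
# the weights are the posteriors p_{Theta|X}(theta_i | X = 5) quoted above.
thetas = [0.3, 0.7, 0.95]
weights = [0.9010, 0.0989, 0.0001]
k = 5  # all five questions answered correctly

# q_i = P(student knew the answer | answered correctly, category i)
qs = [t / (t + (1 - t) / 3) for t in thetas]

def pmf(m):
    """Mixture-of-binomials posterior p_{M|X}(m | X = 5)."""
    return sum(w * comb(k, m) * q ** m * (1 - q) ** (k - m)
               for w, q in zip(weights, qs))

posterior_M = [pmf(m) for m in range(k + 1)]
print([round(p, 4) for p in posterior_M])

map_estimate = max(range(k + 1), key=pmf)             # MAP: m-hat = 3
lms_estimate = sum(m * pmf(m) for m in range(k + 1))  # about 2.967
print(map_estimate, round(lms_estimate, 3))
```

The computed PMF matches the listed values to rounding error, the MAP estimate is 3, and the conditional expectation comes out near 2.967, matching the $\approx 2.9668 \approx 3$ above.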

This note was uploaded on 01/11/2011 for the course MATH 170 taught by Professor Staff during the Spring '08 term at UCLA.
