# Solutions to Problems 8.4 and 8.5


The conditional expectation estimator is

$$
E[\Theta \mid X = (30, 25, 15, 40, 20)] = \frac{\displaystyle\int_0^{1/5} \theta^7 e^{-130\theta}\, d\theta}{\displaystyle\int_0^{1/5} \theta^6 e^{-130\theta}\, d\theta}.
$$

**Solution to Problem 8.4.** (a) Let $X$ denote the random variable representing the number of questions answered correctly. For each value $\theta \in \{\theta_1, \theta_2, \theta_3\}$, Bayes' rule gives

$$
p_{\Theta \mid X}(\theta \mid k) = \frac{p_\Theta(\theta)\, p_{X \mid \Theta}(k \mid \theta)}{\sum_{i=1}^{3} p_\Theta(\theta_i)\, p_{X \mid \Theta}(k \mid \theta_i)}.
$$

The conditional PMF $p_{X \mid \Theta}$ is binomial with $n = 10$ and probability of success $p_i$ equal to the probability of answering a question correctly, given that the student is of category $i$, i.e.,

$$
p_i = \theta_i + (1 - \theta_i) \cdot \frac{1}{3} = \frac{2\theta_i + 1}{3}.
$$

Thus we have

$$
p_1 = \frac{1.6}{3}, \qquad p_2 = \frac{2.4}{3}, \qquad p_3 = \frac{2.9}{3}.
$$

For a given number of correct answers $k$, the MAP rule selects the category $i$ for which the corresponding binomial probability $\binom{10}{k} p_i^k (1 - p_i)^{10-k}$ is maximized.

(b) The posterior PMF of $M$ is given by

$$
p_{M \mid X}(m \mid X = k) = \sum_{i=1}^{3} p_{\Theta \mid X}(\theta_i \mid X = k)\, P(M = m \mid X = k, \Theta = \theta_i).
$$

The probabilities $p_{\Theta \mid X}(\theta_i \mid X = k)$ were calculated in part (a); for $k = 5$,

$$
p_{\Theta \mid X}(\theta_1 \mid X = 5) \approx 0.9010, \qquad
p_{\Theta \mid X}(\theta_2 \mid X = 5) \approx 0.0989, \qquad
p_{\Theta \mid X}(\theta_3 \mid X = 5) \approx 0.0001.
$$

The probability that the student knows the answer to a question that she answered correctly is

$$
q_i = \frac{\theta_i}{\theta_i + (1 - \theta_i)/3}, \qquad i = 1, 2, 3,
$$

so the probabilities $P(M = m \mid X = k, \Theta = \theta_i)$ are binomial and can be calculated in the manner described in Problem 2(b):

$$
P(M = m \mid X = k, \Theta = \theta_i) = \binom{k}{m} q_i^m (1 - q_i)^{k-m}.
$$

For $k = 5$, the posterior PMF can be explicitly calculated for $m = 0, \ldots, 5$, and the MAP and LMS estimates can then be obtained from it:

$$
p_{M \mid X}(0 \mid X = 5) \approx 0.0145,
$$

$$
p_{M \mid X}(1 \mid X = 5) \approx 0.0929, \qquad
p_{M \mid X}(2 \mid X = 5) \approx 0.2402, \qquad
p_{M \mid X}(3 \mid X = 5) \approx 0.3173,
$$
$$
p_{M \mid X}(4 \mid X = 5) \approx 0.2335, \qquad
p_{M \mid X}(5 \mid X = 5) \approx 0.1015.
$$

It follows that the MAP estimate is $\hat{m} = 3$. The conditional expectation estimate is

$$
E[M \mid X = 5] = \sum_{m=1}^{5} m\, p_{M \mid X}(m \mid X = 5) \approx 2.9668 \approx 3.
$$

**Solution to Problem 8.5.** According to the MAP rule, we need to maximize over $\theta \in [0, 1]$ the posterior PDF

$$
f_{\Theta \mid X}(\theta \mid k) = \frac{f_\Theta(\theta)\, p_{X \mid \Theta}(k \mid \theta)}{\int_0^1 f_\Theta(\theta')\, p_{X \mid \Theta}(k \mid \theta')\, d\theta'},
$$

where $X$ is the number of heads observed. Since the denominator is a positive constant, we only need to maximize

$$
f_\Theta(\theta)\, p_{X \mid \Theta}(k \mid \theta) = \binom{n}{k}\, 4\left(\frac{1}{2} - \left|\theta - \frac{1}{2}\right|\right) \theta^k (1 - \theta)^{n-k}.
$$

The function to be maximized is differentiable except at $\theta = 1/2$. This leads to three different possibilities:

(a) the maximum is attained at $\theta = 1/2$;

(b) the maximum is attained at some $\theta < 1/2$, at which the derivative is equal to zero;

(c) the maximum is attained at some $\theta > 1/2$, at which the derivative is equal to zero.

Let us consider the second possibility. For $\theta < 1/2$, we have $f_\Theta(\theta) = 4\theta$. The function to be maximized, ignoring the constant factor $4\binom{n}{k}$, is $\theta^{k+1}(1 - \theta)^{n-k}$. By setting the derivative to zero, we find $\hat{\theta} = (k + 1)/(n + 1)$, provided that $(k + 1)/(n + 1) < 1/2$.

Let us now consider the third possibility. For $\theta > 1/2$, we have $f_\Theta(\theta) = 4(1 - \theta)$. The function to be maximized, ignoring the constant factor $4\binom{n}{k}$, is $\theta^k(1 - \theta)^{n-k+1}$.
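The figures quoted for Problem 8.4 can be reproduced with a short script. The sketch below assumes $\theta = (0.3, 0.7, 0.95)$, the values implied by $p_i = (2\theta_i + 1)/3$, and a uniform prior over the three categories; the prior is not stated in this excerpt, but a uniform one reproduces the quoted posterior values.

```python
from math import comb

# Assumed category parameters, implied by p_i = (2*theta_i + 1)/3
# together with p = (1.6/3, 2.4/3, 2.9/3):
thetas = [0.3, 0.7, 0.95]
prior = [1 / 3, 1 / 3, 1 / 3]   # assumption: uniform prior over categories
n, k = 10, 5

# Part (a): posterior over categories given k correct answers out of n.
ps = [(2 * t + 1) / 3 for t in thetas]                      # success probabilities p_i
lik = [comb(n, k) * p**k * (1 - p)**(n - k) for p in ps]    # binomial likelihoods
z = sum(pr * l for pr, l in zip(prior, lik))
post_theta = [pr * l / z for pr, l in zip(prior, lik)]      # ≈ 0.9010, 0.0989, 0.00003

# Part (b): posterior PMF of M, the number of questions the student actually knew.
qs = [t / (t + (1 - t) / 3) for t in thetas]                # P(knew | answered correctly)
pm = [
    sum(pt * comb(k, m) * q**m * (1 - q)**(k - m) for pt, q in zip(post_theta, qs))
    for m in range(k + 1)
]

m_map = max(range(k + 1), key=pm.__getitem__)               # MAP estimate: 3
m_lms = sum(m * p for m, p in enumerate(pm))                # LMS estimate ≈ 2.967
```

Note that $\binom{10}{k}$ cancels in the posterior ratio, so including it in `lik` is harmless; the small discrepancy between `m_lms` and the quoted 2.9668 comes from the rounding of the intermediate posteriors in the text.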
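As a sanity check on Problem 8.5, the closed-form candidate $\hat{\theta} = (k+1)/(n+1)$ from the second possibility can be compared against a brute-force grid maximization of the unnormalized posterior. Here $n = 10$, $k = 3$ is a hypothetical example chosen so that $(k+1)/(n+1) = 4/11 < 1/2$ and that possibility applies.

```python
# Triangular prior on [0, 1]: f(t) = 4t for t <= 1/2, and 4(1 - t) for t > 1/2.
def prior(t: float) -> float:
    return 4 * t if t <= 0.5 else 4 * (1 - t)

def posterior(t: float, n: int, k: int) -> float:
    # Unnormalized posterior; the constant binom(n, k) is irrelevant to the argmax.
    return prior(t) * t**k * (1 - t)**(n - k)

n, k = 10, 3   # hypothetical example with (k + 1)/(n + 1) = 4/11 < 1/2

# Brute-force maximization over a fine grid of the open interval (0, 1).
grid = [i / 100_000 for i in range(1, 100_000)]
theta_hat = max(grid, key=lambda t: posterior(t, n, k))

closed_form = (k + 1) / (n + 1)   # 4/11 ≈ 0.3636, matching the grid maximizer
```

The same grid search can be reused to check the third possibility (or the corner case $\hat{\theta} = 1/2$) for values of $n$ and $k$ where the maximizing branch differs.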