EE 351K PROBABILITY & RANDOM PROCESSES, FALL 2011
Instructor: Sujay Sanghavi, [email protected]

Homework 8 Solution

Problem 1
Nefeli, a student in a probability class, takes a multiple-choice test with 10 questions and 3 choices per question. For each question, there are two equally likely possibilities, independent of other questions: either she knows the answer, in which case she answers the question correctly, or else she guesses the answer with probability of success 1/3.

(a) Given that Nefeli answered the first question correctly, what is the probability that she knew the answer to that question?

(b) Given that Nefeli answered 6 out of the 10 questions correctly, what is the posterior PMF of the number of questions to which she knew the answer?

Sol:
(a) By Bayes' rule,

    P(know answer | answer is correct)
        = P(answer is correct | know answer) P(know answer) / P(answer is correct)
        = (1 x 1/2) / (1 x 1/2 + 1/3 x 1/2)
        = 3/4.

(b) Let the random variable M be the number of questions to which Nefeli knew the answer, and let E be the event that 6 of her answers are correct. M is binomially distributed with parameters (10, 1/2), so

    P(M = m) = C(10, m) (1/2)^m (1/2)^(10-m),    0 <= m <= 10.

Given M = m, she must guess correctly on 6 - m of the remaining 10 - m questions, each with success probability 1/3, so the conditional probability of E is

    P(E | M = m) = 0                                      for m > 6,
    P(E | M = m) = C(10-m, 6-m) (1/3)^(6-m) (2/3)^4       for 0 <= m <= 6.

By the total probability formula,

    P(E) = sum_{m=0}^{6} P(E | M = m) P(M = m) = C(10, 4) (2/3)^6 (1/3)^4.

Then, by Bayes' rule,

    P(M = m | E) = 0                                  for m > 6,
    P(M = m | E) = C(6, m) (3/4)^m (1/4)^(6-m)        for 0 <= m <= 6.

You can also get this PMF directly from part (a): for each correctly answered question, the probability that she knew the answer is 3/4, independently across questions, so the posterior is binomially distributed with parameters (6, 3/4).
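As a numerical sanity check (not part of the original solution), the sketch below computes the posterior P(M = m | E) by brute-force Bayes' rule and confirms it matches the Binomial(6, 3/4) PMF claimed in part (b). The function names are illustrative.

```python
from math import comb

def prior_M(m):
    # M ~ Binomial(10, 1/2): number of questions Nefeli knows
    return comb(10, m) * 0.5**10

def likelihood(m):
    # P(6 correct | M = m): she must guess 6 - m of the 10 - m
    # remaining questions correctly, each with probability 1/3
    if m > 6:
        return 0.0
    return comb(10 - m, 6 - m) * (1/3)**(6 - m) * (2/3)**4

# Total probability P(E); should equal C(10,4) (2/3)^6 (1/3)^4
total = sum(likelihood(m) * prior_M(m) for m in range(11))

for m in range(7):
    bayes = likelihood(m) * prior_M(m) / total
    binom = comb(6, m) * 0.75**m * 0.25**(6 - m)  # Binomial(6, 3/4)
    assert abs(bayes - binom) < 1e-12

print("posterior matches Binomial(6, 3/4)")
```

Running the loop without an assertion error confirms both the closed form for P(E) and the direct Binomial(6, 3/4) argument.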
Problem 2
Suppose points in R are drawn from two classes, C1 and C2, which are normally distributed with parameters (1, 1) and (-1, 1) respectively. If it is known that the priors of C1 and C2 are 1/5 and 4/5 respectively, what is the Bayesian optimal decision boundary?

Sol: We know that

    f_{X|C1}(x) = (1/sqrt(2 pi)) e^(-(x-1)^2 / 2),
    f_{X|C2}(x) = (1/sqrt(2 pi)) e^(-(x+1)^2 / 2),
    P(C1) = 1/5,  P(C2) = 4/5.

The Bayesian boundary is where the two posteriors are equal:

    (1/5) (1/sqrt(2 pi)) e^(-(x-1)^2 / 2) = (4/5) (1/sqrt(2 pi)) e^(-(x+1)^2 / 2).

Taking logarithms and simplifying,

    ((x+1)^2 - (x-1)^2) / 2 = ln 4,  i.e.  2x = ln 4,

so the decision boundary is x = ln 2 ~ 0.693: points with x > ln 2 are assigned to C1, and points with x < ln 2 to C2.
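A quick numerical check of the boundary (an illustrative sketch, not part of the original solution): the ratio of the two class posteriors should equal exactly 1 at x = ln 2, exceed 1 on the C1 side, and fall below 1 on the C2 side.

```python
from math import exp, log, pi, sqrt

def posterior_ratio(x):
    # Ratio of unnormalized posteriors P(C1) f(x|C1) / (P(C2) f(x|C2))
    f1 = exp(-(x - 1)**2 / 2) / sqrt(2 * pi)  # N(1, 1) density
    f2 = exp(-(x + 1)**2 / 2) / sqrt(2 * pi)  # N(-1, 1) density
    return (0.2 * f1) / (0.8 * f2)            # priors 1/5 and 4/5

x_star = log(2)
assert abs(posterior_ratio(x_star) - 1.0) < 1e-12  # equal at the boundary
assert posterior_ratio(1.0) > 1.0                  # C1 side of the boundary
assert posterior_ratio(0.0) < 1.0                  # C2 side of the boundary
```

Note that the ratio simplifies analytically to (1/4) e^(2x), which is exactly why the boundary lands at 2x = ln 4.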