2 Bayes Rule

2.1 Introduction

Here is a basic exposition of Bayes' rule. Suppose you have events E_1, ..., E_k that form a partition of the sample space. One is given

1. P(E_i), i = 1, ..., k
2. P(A | E_i), i = 1, ..., k

One is interested in computing the probabilities P(E_i | A), i = 1, ..., k. By a standard manipulation of conditional probabilities, one obtains the result:

    P(E_i | A) = P(E_i) P(A | E_i) / [ sum_{j=1}^{k} P(E_j) P(A | E_j) ].

2.2 Illustrations of Bayes' Rule

2.2.1 Example: Student Takes a Test

Suppose a student is taking a one-question multiple-choice test that has four possible choices. Either the student knows the material or she doesn't; we denote these two possibilities by K and "not K". Based on previous work, the teacher decides the student likely knows the material and so assigns P(K) = 0.7. Therefore, the probability the student doesn't know the material is

    P(not K) = 1 - 0.7 = 0.3.

The student will take the one-question test and either get the question correct, an event we denote by C, or get it wrong. If the student knows the material, the chance she gets the question correct is 90%. On the other hand, if the student doesn't know the material, she will guess and obtain the correct answer with probability 25%. Suppose the student takes the test and gets the question correct. What is the probability she really knows the material?

Here the events K and "not K" form a partition of the sample space, and we are given the probabilities of these two events. The probability the student gets the question correct depends on whether she knows the material; we are given that

    P(C | K) = 0.9,    P(C | not K) = 0.25.

Given that the student gets the question correct, we are interested in determining the probability of K; that is, we wish to compute P(K | C). By Bayes' rule, this is given by

    P(K | C) = P(K) P(C | K) / [ P(K) P(C | K) + P(not K) P(C | not K) ].
Substituting in the given values, we obtain

    P(K | C) = (0.7 x 0.9) / (0.7 x 0.9 + 0.3 x 0.25) = 0.63 / (0.63 + 0.075) = 0.894.

Does this answer make sense? Before the test, the teacher believed that the student knew the material with probability 0.7. The student got the question correct, which intuitively should increase the teacher's probability that the student knows the material. Bayes' rule allows us to compute explicitly how much the probability should increase: it has increased from P(K) = 0.7 to P(K | C) = 0.894.

2.2.2 Example: Balls in a Bag

Suppose a bag contains exactly one white ball. You roll a die, and if the outcome of the roll is i, you add i red balls to the bag. You then select a ball from the bag at random, and its color is red. What is the chance that the die roll was i?
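These computations are easy to automate. Below is a minimal Python sketch of Bayes' rule for a partition; the function name bayes_rule is our own, and the second example assumes a fair six-sided die, so the prior on each roll is 1/6 and, with i red balls plus the one white ball in the bag, the likelihood of drawing red given roll i is i/(i+1).

```python
def bayes_rule(priors, likelihoods):
    """Posterior probabilities P(E_i | A) for a partition E_1, ..., E_k,
    given priors P(E_i) and likelihoods P(A | E_i)."""
    joint = [p * q for p, q in zip(priors, likelihoods)]
    total = sum(joint)  # P(A), by the law of total probability
    return [j / total for j in joint]

# Student test example: partition {K, not K}, observed event C (correct answer).
post_K, post_notK = bayes_rule([0.7, 0.3], [0.9, 0.25])
print(round(post_K, 3))  # 0.894, matching the computation in the text

# Balls-in-a-bag example (assuming a fair six-sided die):
# prior P(roll = i) = 1/6, likelihood P(red | roll = i) = i / (i + 1).
priors = [1 / 6] * 6
likelihoods = [i / (i + 1) for i in range(1, 7)]
posterior = bayes_rule(priors, likelihoods)  # posterior[i-1] = P(roll = i | red)
```

Note that the equal priors cancel in the second example, so the posterior is simply each likelihood divided by the sum of the likelihoods; larger rolls, which put more red balls in the bag, receive higher posterior probability.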