Two immediate consequences of the definition of a probability are

$$P(A) = 1 - P(A^c) \tag{A.1}$$

$$P(B \cup C) = P(B) + P(C) - P(B \cap C) \tag{A.2}$$

To illustrate their use, consider the following:

**Example A.3.** Roll two dice and suppose for simplicity that they are red and green. Let $A$ = "at least one 4 appears," $B$ = "a 4 appears on the red die," and $C$ = "a 4 appears on the green die," so $A = B \cup C$.

*Solution 1.* $A^c$ = "neither die shows a 4," which contains $5 \cdot 5 = 25$ outcomes, so (A.1) implies $P(A) = 1 - 25/36 = 11/36$.

*Solution 2.* $P(B) = P(C) = 1/6$ while $P(B \cap C) = P(\{(4,4)\}) = 1/36$, so (A.2) implies $P(A) = 1/6 + 1/6 - 1/36 = 11/36$.

## Conditional probability

Suppose we are told that the event $A$ with $P(A) > 0$ occurs. The sample space is then reduced from $\Omega$ to $A$, and the probability that $B$ will occur given that $A$ has occurred is

$$P(B \mid A) = \frac{P(B \cap A)}{P(A)} \tag{A.3}$$

To explain this formula, note that (i) only the part of $B$ that lies in $A$ can possibly occur, and (ii) since the sample space is now $A$, we have to divide by $P(A)$ to make $P(A \mid A) = 1$. Multiplying on ea...
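Since the two-dice sample space has only 36 equally likely outcomes, formulas (A.1), (A.2), and (A.3) can be checked by brute-force enumeration. The sketch below (not part of the original text; the event names `A`, `B`, `C` follow Example A.3) uses exact fractions to avoid floating-point error:

```python
from fractions import Fraction

# All 36 equally likely outcomes: (red die, green die).
outcomes = [(r, g) for r in range(1, 7) for g in range(1, 7)]

def prob(event):
    """Probability of an event given as a list of outcomes."""
    return Fraction(len(event), len(outcomes))

A = [o for o in outcomes if 4 in o]      # at least one 4 appears
B = [o for o in outcomes if o[0] == 4]   # a 4 on the red die
C = [o for o in outcomes if o[1] == 4]   # a 4 on the green die

# Solution 1, via the complement rule (A.1): P(A) = 1 - P(A^c).
Ac = [o for o in outcomes if o not in A]
assert 1 - prob(Ac) == Fraction(11, 36)

# Solution 2, via inclusion-exclusion (A.2): P(B u C) = P(B) + P(C) - P(B n C).
B_and_C = [o for o in B if o in C]       # only the outcome (4, 4)
assert prob(B) + prob(C) - prob(B_and_C) == Fraction(11, 36)

# Conditional probability (A.3): P(B | A) = P(B n A) / P(A).
# Here B is a subset of A, so B n A = B, with 6 outcomes.
B_and_A = [o for o in B if o in A]
p_B_given_A = prob(B_and_A) / prob(A)
print(p_B_given_A)  # (6/36) / (11/36) = 6/11
```

Note that conditioning on $A$ raises the probability of $B$ from $1/6$ to $6/11$: knowing a 4 appeared somewhere makes a 4 on the red die more likely.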

*This document was uploaded on 03/06/2014 for the course MATH 4740 at Cornell University (Engineering School).*
