
# Professor Charles B. Moss, Lecture IV, Fall 2010


$$
P[x_1 = 4] = P[\{4,1\}] + P[\{4,2\}] + P[\{4,3\}] + P[\{4,4\}] + P[\{4,5\}] + P[\{4,6\}]
= \frac{1}{36} + \frac{1}{36} + \frac{1}{36} + \frac{1}{36} + \frac{1}{36} + \frac{1}{36}
= \frac{6}{36} = \frac{1}{6} \quad (2)
$$

C. The conditional probability is then the probability of one event, such as the probability that the first die is a 4, given that the value of another random variable is known, such as the fact that the second die roll is equal to 6. In the foregoing example of the fair dice, this value is 1/6.

1. Definition 2.10 (p. 18): Given that $A$ and $B$ are sets defined on $C$, the Axioms of Conditional Probability are

   a. $P[A|B] \geq 0$ for all $A$,

   b. $P[A|A] = 1$,

   c. If $\{A_i \cap B\}$ are mutually exclusive events,
      $$P[A_1 \cup A_2 \cup \cdots \cup A_n \mid B] = P[A_1|B] + P[A_2|B] + \cdots + P[A_n|B] \quad (3)$$

   d. If $B \supset H$, $B \supset G$, and $P[G] \neq 0$, then
      $$\frac{P[H|B]}{P[G|B]} = \frac{P[H]}{P[G]} \quad (4)$$

   $$P[A|B] = \frac{P[A \cap B]}{P[B]} \quad (5)$$

2. Discussion of Definition 2.10.

   a. The first three conditions follow the general axioms of probability theory.

   b. The final condition states that the relative probability of a c…
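The dice calculations above can be checked by brute-force enumeration. This is a minimal sketch (not part of the original notes): it enumerates the 36 equally likely outcomes of two fair dice, computes the marginal probability in equation (2), and then applies the formula $P[A|B] = P[A \cap B]/P[B]$ from equation (5) to confirm the conditional probability of 1/6 cited in part C.

```python
from fractions import Fraction

# All 36 equally likely outcomes of rolling two fair dice.
outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

def prob(event):
    """Probability of an event (a set of outcomes) under the uniform measure."""
    return Fraction(sum(1 for o in outcomes if o in event), len(outcomes))

# Equation (2): marginal probability that the first die shows 4.
A = {o for o in outcomes if o[0] == 4}
print(prob(A))                 # 1/6

# Equation (5): P[A|B] = P[A ∩ B] / P[B].
# Conditioning on the second die showing 6 leaves the answer at 1/6,
# since the two rolls are independent.
B = {o for o in outcomes if o[1] == 6}
print(prob(A & B) / prob(B))   # 1/6
```

Using `Fraction` keeps the arithmetic exact, so the result matches the hand computation 6/36 = 1/6 term by term rather than as a floating-point approximation.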

This note was uploaded on 02/01/2012 for the course AEB 6182, taught by Professor Weldon during the Fall '08 term at the University of Florida.
