Suppose P(E | F) = P(E), i.e., knowing F doesn't help in predicting E. Then E and F are independent. What we have said is that in this case

P(E | F) = P(E ∩ F)/P(F) = P(E),

or P(E ∩ F) = P(E) P(F). We use the latter equation as a definition: we say E and F are independent if

P(E ∩ F) = P(E) P(F).

Example. Suppose you flip two coins. The outcome of heads on the second is independent of the outcome of tails on the first. To be more precise, if A is tails for the first coin and B is heads for the second, and we assume we have fair coins (although this is not necessary), we have P(A ∩ B) = 1/4 = (1/2) · (1/2) = P(A) P(B).

Example. Suppose you draw a card from an ordinary deck. Let E be the event that you drew an ace and F the event that you drew a spade. Here 1/52 = P(E ∩ F) = (1/13) · (1/4) = P(E) P(F), so E and F are independent.

Proposition 3.1. If E and F are independent, then E and F^c are independent.

Proof. P(E ∩ F^c) = P(E) − P(E ∩ F) = P(E) − P(E) P(F) = P(E)[1 − P(F)] = P(E) P(F^c).
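The card example and Proposition 3.1 can both be checked by brute-force enumeration of the 52-outcome sample space. The sketch below (names like `prob`, `ranks`, and `suits` are illustrative, not from the text) computes each probability as a count divided by 52 and verifies the two identities exactly with rational arithmetic:

```python
from fractions import Fraction

# Enumerate a standard 52-card deck as (rank, suit) pairs.
ranks = list(range(13))                     # 0 stands for the ace
suits = ["spades", "hearts", "diamonds", "clubs"]
deck = [(r, s) for r in ranks for s in suits]

def prob(event):
    """Probability of an event (a set of outcomes) under the uniform measure."""
    return Fraction(len(event), len(deck))

E = {c for c in deck if c[0] == 0}          # drew an ace
F = {c for c in deck if c[1] == "spades"}   # drew a spade
Fc = set(deck) - F                          # complement event F^c

# Independence of E and F: P(E ∩ F) = P(E) P(F) = 1/52
assert prob(E & F) == prob(E) * prob(F) == Fraction(1, 52)

# Proposition 3.1: E and F^c are then independent as well
assert prob(E & Fc) == prob(E) * prob(Fc)
```

Using `Fraction` rather than floats keeps the check exact: P(E ∩ F^c) = 3/52 matches (1/13) · (3/4) with no rounding concerns.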
Fall '06 · SCHWAGER · Probability, Probability theory, Randomness
