NOTES ON CONDITIONAL PROBABILITY, BAYES' THEOREM, AND INDEPENDENCE

by A. Ledoan

The ideas of conditional probability and independence play a central role in the study of random phenomena. Conditional probability is the study of how additional information can alter our notion of how likely another event is to occur. The notion of independence was used informally in Chapter 2 of Ross [1]; one of our tasks here will be to formalize it.

1. Conditional Probability

The probability of an event E is, in a sense, a measure of our degree of ignorance concerning the outcome. Thus, if we have some information about the outcome (for instance, a stock market investor who has picked up an inside tip), that information may alter the probability of E. Suppose that a random experiment is performed and assume further that the event F is known to have occurred. What is the resulting probability of E? How does this partial knowledge revise its probability?

Suppose that the experiment is repeated a large number of times, say N. Assume that of these N repetitions, F occurred n(F) times and EF occurred n(EF) times. (Here and elsewhere we use the abbreviation EF = E ∩ F.) Consequently, among the n(F) repetitions in which F occurred, E occurred n(EF) times, so the relative frequency of E given that F occurred is n(EF)/n(F). If N is large, then n(F) will be large as well (assuming that P(F) > 0). The ratio n(EF)/n(F) will then represent the probability that E occurred given the knowledge that F occurred. Moreover, from the frequency interpretation of probability we obtain

    lim_{N→∞} n(EF)/n(F) = lim_{N→∞} [n(EF)/N] / [n(F)/N] ≈ P(EF)/P(F).

This is our intuitive definition of conditional probability, and it implies that the approximation will be accurate for large N. For these reasons, and more, conditional probability is defined in the following manner.

Definition 1.
For any events E and F defined on a sample space S such that P(F) > 0, the conditional probability of E, given that F has already occurred, is defined by

    P(E | F) = P(EF)/P(F).    (1.1)

If P(F) = 0, then P(E | F) is undefined. The following examples are typical applications of (1.1).

Example 1. Find the probability that a poker hand contains 2 jacks if you already know it contains 1 ace.

Solution. By Definition 1,

    P(2 jacks | 1 ace) = P(2 jacks and 1 ace) / P(1 ace)
                       = [C(4,2) · 4 · C(44,2) / C(52,5)] / [4 · C(48,4) / C(52,5)]
                       = C(4,2) C(44,2) / C(48,4).

Example 2. Draw 2 cards without replacement. Find the probability of drawing 2 jacks given that at least one of the cards is a face card.

Example 3. Consider families with two children. Assuming that all outcomes are equally likely, what is the probability that both children are boys, given that at least one is a boy?

Example 4. Two fair dice are rolled. What is the probability that (a) the first one lands on 6 given that the sum of the dice is 8, and (b) at least one lands on 6...
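The examples above can be checked by direct computation. The sketch below (a minimal illustration, not part of the original notes) evaluates Example 1's closed form with `math.comb`, and settles Examples 3 and 4(a) by enumerating the finite sample spaces and applying (1.1) as a ratio of counts:

```python
from fractions import Fraction
from itertools import product
from math import comb

# Example 1: P(2 jacks | 1 ace) = C(4,2)*C(44,2) / C(48,4).
p_ex1 = Fraction(comb(4, 2) * comb(44, 2), comb(48, 4))
print(float(p_ex1))  # roughly 0.029

# Example 3: two children, each B or G, all four outcomes equally likely.
outcomes = list(product("BG", repeat=2))
at_least_one_boy = [o for o in outcomes if "B" in o]      # the event F
both_boys = [o for o in at_least_one_boy if o == ("B", "B")]  # EF within F
p_ex3 = Fraction(len(both_boys), len(at_least_one_boy))   # 1/3

# Example 4(a): two fair dice, P(first lands on 6 | sum is 8).
rolls = list(product(range(1, 7), repeat=2))
sum_is_8 = [r for r in rolls if sum(r) == 8]              # (2,6),(3,5),(4,4),(5,3),(6,2)
first_is_6 = [r for r in sum_is_8 if r[0] == 6]           # only (6,2)
p_ex4a = Fraction(len(first_is_6), len(sum_is_8))         # 1/5
```

Because every outcome is equally likely in each example, P(E | F) reduces to the count of EF outcomes divided by the count of F outcomes, which is exactly the ratio n(EF)/n(F) from the frequency argument.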

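The frequency argument behind Definition 1 can also be illustrated by simulation. This sketch (an illustration assuming the dice setup of Example 4(a), not from the original notes) counts n(F) and n(EF) over N simulated rolls and checks that n(EF)/n(F) approaches the exact answer 1/5:

```python
import random

# E = "first die lands on 6", F = "sum of the dice is 8".
# Estimate P(E | F) by the ratio n(EF)/n(F), as in the frequency argument.
random.seed(0)
N = 200_000
n_F = n_EF = 0
for _ in range(N):
    a = random.randint(1, 6)
    b = random.randint(1, 6)
    if a + b == 8:
        n_F += 1
        if a == 6:
            n_EF += 1

estimate = n_EF / n_F
print(estimate)  # close to 0.2 for large N
```

As N grows, n(F)/N approaches P(F) = 5/36 and n(EF)/N approaches P(EF) = 1/36, so the ratio approaches P(EF)/P(F) = 1/5, matching the limit displayed in Section 1.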