chap04 - Chapter 4 Conditional Probability and Independence...

Chapter 4: Conditional Probability and Independence

In the context of a random experiment, knowing that a certain event B has occurred may completely change the likelihood we associate with another event A. For example, suppose we roll two fair dice:

- The sample space is S = {(x, y) : x, y ∈ {1, 2, ..., 6}}.
- Let A denote the event that the sum x + y = 11, i.e., A = {(5, 6), (6, 5)}, and let B denote the event that x = 1, i.e., B = {(1, 1), (1, 2), ..., (1, 6)}.
- Assuming that the dice are fair, the probability of A is P(A) = 2/36.
- Now, suppose we know that B occurred, i.e., the first die shows 1.
- Under this condition, event A is impossible, and its likelihood or probability becomes 0.

Conditional probabilities provide quantitative measures of likelihood (probability) under the assumption that certain events have occurred, or equivalently, that certain a priori knowledge is available. In certain situations, knowing that B has occurred does not change the likelihood of A; this idea is formalized via the mathematical concept of independence.

The concepts of conditional probability and independence play a major role in the design and analysis of modern information processing systems, such as digital radio receivers, speech recognition systems, and file compression algorithms.

© 2003 Benoît Champagne. Compiled February 2, 2012.

4.1 Conditional probability

Relative frequency interpretation: Consider a random experiment. Let A and B denote two events of interest with P(B) > 0. Suppose this experiment is repeated a large number of times, say n. According to the relative frequency interpretation of probability, we have

    P(A) ≈ n(A)/n,    P(B) ≈ n(B)/n,    P(A ∩ B) ≈ n(A ∩ B)/n        (4.1)

where n(A), n(B) and n(A ∩ B) denote the numbers of occurrences of the events A, B and A ∩ B within the n repetitions.
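The relative-frequency approximations in (4.1) can be checked numerically for the two-dice example above. The following is a minimal sketch (variable names are my own, not from the notes), counting occurrences of A = {x + y = 11}, B = {x = 1}, and A ∩ B over n simulated rolls:

```python
import random

random.seed(0)        # fixed seed so the run is reproducible
n = 1_000_000         # number of repetitions of the experiment

count_A = count_B = count_AB = 0
for _ in range(n):
    x, y = random.randint(1, 6), random.randint(1, 6)
    in_A = (x + y == 11)      # event A: the sum is 11
    in_B = (x == 1)           # event B: the first die shows 1
    count_A += in_A
    count_B += in_B
    count_AB += in_A and in_B

print(count_A / n)    # relative frequency of A, close to 2/36
print(count_B / n)    # relative frequency of B, close to 1/6
print(count_AB / n)   # exactly 0: with x = 1 the sum cannot reach 11
```

Note that n(A ∩ B) is exactly zero here, matching the observation that A becomes impossible once B is known to have occurred.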
Provided n(B) is large, the probability of A, knowing or given that B has occurred, might be evaluated as the ratio

    P(A given B) = n(A ∩ B)/n(B),                                    (4.2)

also known as a conditional relative frequency. Using this approach, we have

    P(A given B) = n(A ∩ B)/n(B) = (n(A ∩ B)/n) / (n(B)/n) ≈ P(A ∩ B)/P(B)    (4.3)

This and other considerations lead to the following definition.

Definition: Consider a random experiment (S, F, P). Let B ∈ F and assume that P(B) > 0. For every A ∈ F, the conditional probability of A given B, denoted P(A | B), is defined as

    P(A | B) = P(A ∩ B)/P(B)                                         (4.4)

Remarks: This definition extends the above concept of conditional relative frequency to the axiomatic probability framework....
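Definition (4.4) and the conditional relative frequency (4.2) can be compared numerically. The sketch below (a hypothetical illustration, not from the notes) uses the two-dice experiment with A = {x + y = 7} and B = {x = 1}, chosen so that the conditional probability is nonzero; the exact value is P(A ∩ B)/P(B) = (1/36)/(1/6) = 1/6:

```python
import random

random.seed(1)        # fixed seed for reproducibility
n = 1_000_000         # number of repetitions

n_B = n_AB = 0
for _ in range(n):
    x, y = random.randint(1, 6), random.randint(1, 6)
    in_B = (x == 1)           # event B: the first die shows 1
    in_A = (x + y == 7)       # event A: the sum is 7
    n_B += in_B
    n_AB += in_A and in_B

# Conditional relative frequency (4.2): n(A ∩ B) / n(B).
# By (4.3) this approximates P(A | B) = P(A ∩ B)/P(B) = 1/6.
print(n_AB / n_B)
```

Restricting the count to only those repetitions where B occurred is exactly the intuition behind (4.4): B plays the role of a reduced sample space.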