ReviewPart1

Probabilistic Models

A probabilistic model consists of:

1. The sample space, Ω: the set of all possible outcomes of an experiment.
2. The probability law: assigns to each event, which is a set A of possible outcomes, a nonnegative number P(A), called the probability of A.

[Figure: an experiment produces an outcome in the sample space Ω; the probability law assigns the numbers P(A) and P(B) to events A and B.]
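As a concrete illustration (not part of the original notes), here is a minimal Python sketch of a discrete probabilistic model: a fair six-sided die. The sample space is the set of faces, the probability law assigns 1/6 to each outcome, and an event's probability is the sum over its outcomes.

from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}                   # sample space: the six faces
law = {s: Fraction(1, 6) for s in omega}     # probability law on individual outcomes

def P(event):
    """Probability of an event, i.e. a subset of the sample space."""
    return sum(law[s] for s in event)

A = {2, 4, 6}      # event: the roll is even
B = {5, 6}         # event: the roll is at least 5
print(P(A))        # 1/2
print(P(B))        # 1/3
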
Probability Axioms

1. Nonnegativity: P(A) ≥ 0, for every event A.
2. Additivity: If A and B are disjoint events, then P(A ∪ B) = P(A) + P(B).
3. Normalization: P(Ω) = 1.

Some Properties of Probability Laws

These can be deduced from the axioms and verified using Venn diagrams. Let A, B and C be events.

1. If A ⊂ B, then P(A) ≤ P(B).
2. P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
3. P(A ∪ B) ≤ P(A) + P(B).
4. P(A ∪ B ∪ C) = P(A) + P(Aᶜ ∩ B) + P(Aᶜ ∩ Bᶜ ∩ C).
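The short check below is illustrative only: it reuses the die model with arbitrarily chosen events A, B and C, computes probabilities under an equally-likely-outcomes law, and verifies properties 2–4 numerically.

from fractions import Fraction

omega = set(range(1, 7))                      # fair die, equally likely outcomes

def P(event):
    return Fraction(len(event), len(omega))   # discrete uniform law: |A| / n

A, B, C = {2, 4, 6}, {4, 5, 6}, {1, 6}        # arbitrary illustrative events
Ac, Bc = omega - A, omega - B                 # complements of A and B

assert P(A.union(B)) == P(A) + P(B) - P(A & B)                 # property 2
assert P(A.union(B)) <= P(A) + P(B)                            # property 3
assert P(A.union(B, C)) == P(A) + P(Ac & B) + P(Ac & Bc & C)   # property 4
print("properties 2-4 hold for these events")
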
Discrete Models

Discrete Probability Law: If Ω is finite, then any event has the form A = {s₁, s₂, ..., sₙ}, where sᵢ ∈ Ω, and

    P(A) = P(s₁) + P(s₂) + ··· + P(sₙ).

Discrete Uniform Probability Law: If Ω consists of n equally likely outcomes, then

    P(A) = (number of elements of A) / n.

Conditional Probability

Given an event B with P(B) > 0, the conditional probability of A is

    P(A | B) = P(A ∩ B) / P(B).

P(· | B) is a new probability law on Ω and satisfies the probability axioms:

1. P(A | B) ≥ 0.
2. P(Ω | B) = 1.
3. P(A₁ ∪ A₂ | B) = P(A₁ | B) + P(A₂ | B), when A₁ ∩ A₂ = ∅.

P(A | B) can be viewed as a probability law on a new universe, B, where all the conditional probability is concentrated.
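A quick illustration (the two-dice setup and the events are mine, not from the notes) of conditional probability under the discrete uniform law, together with the "new universe" viewpoint:

from fractions import Fraction
from itertools import product

omega = set(product(range(1, 7), repeat=2))   # two fair dice: 36 equally likely outcomes

def P(event):
    return Fraction(len(event), len(omega))   # discrete uniform law

A = {s for s in omega if s[0] + s[1] == 8}    # event: the dice sum to 8
B = {s for s in omega if s[0] % 2 == 0}       # event: the first die is even

cond = P(A & B) / P(B)                        # definition: P(A | B) = P(A ∩ B) / P(B)
print(cond)                                   # 1/6

# Equivalently, treat B as the new universe of equally likely outcomes:
assert cond == Fraction(len(A & B), len(B))
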
Multiplication Rule

Used to find the (unconditional) probability of an event when the experiment has a sequential nature and the relevant conditional probabilities are known. Assuming all conditioning events have positive probability,

    P(A₁ ∩ A₂ ∩ ··· ∩ Aₙ) = P(A₁) P(A₂ | A₁) P(A₃ | A₁ ∩ A₂) ··· P(Aₙ | A₁ ∩ ··· ∩ Aₙ₋₁).

The possible outcomes of the sequential experiment can be represented using a tree. The probability of a leaf (event) is the product of the probabilities along the path traversed from the root to that leaf.
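A small worked sketch of the multiplication rule (the card-drawing example is illustrative, not from the notes): draw three cards from a standard 52-card deck without replacement and ask for the probability that all three are hearts.

from fractions import Fraction

# A_i = "the i-th card drawn is a heart"; each factor conditions on the earlier draws.
p_A1           = Fraction(13, 52)   # P(A1): 13 hearts among 52 cards
p_A2_given_A1  = Fraction(12, 51)   # P(A2 | A1): one heart and one card already removed
p_A3_given_A12 = Fraction(11, 50)   # P(A3 | A1 ∩ A2): two hearts and two cards removed

p_all_hearts = p_A1 * p_A2_given_A1 * p_A3_given_A12    # multiplication rule
print(p_all_hearts)                                     # 11/850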