# COMPSCI 240: Reasoning Under Uncertainty


Arya Mazumdar, University of Massachusetts at Amherst, Fall 2016


## Lecture 24: Review
## Model of Probability
- A sample space Ω.
- A probability law that assigns to each event A ⊆ Ω a number P(A).
- Events are sets of outcomes.
- Modeling principle: more likely events get more probability.


## Axioms of Probability
- Nonnegativity: P(A) ≥ 0.
- Additivity: for any two disjoint sets A, B, P(A ∪ B) = P(A) + P(B). This extends to countably many disjoint events A₁, A₂, A₃, …: P(∪ᵢ Aᵢ) = Σᵢ P(Aᵢ).
- Normalization: P(Ω) = 1.
## Basic Identities
- P(∅) = 0.
- P(Aᶜ) = 1 − P(A).
- If A ⊆ B, then P(A) ≤ P(B).
- P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
- Sub-additivity: P(A ∪ B) ≤ P(A) + P(B).
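These identities are easy to sanity-check by brute-force enumeration on a small finite sample space. A minimal sketch (the single die roll and the events "even" and "at least 5" are illustrative choices, not from the lecture):

```python
from fractions import Fraction

# Sample space: one roll of a fair six-sided die.
omega = {1, 2, 3, 4, 5, 6}

def P(event):
    """Uniform probability law: P(A) = |A| / |Omega|."""
    return Fraction(len(event & omega), len(omega))

A = {2, 4, 6}   # roll is even
B = {5, 6}      # roll is at least 5

# Complement rule: P(A^c) = 1 - P(A)
assert P(omega - A) == 1 - P(A)

# Inclusion-exclusion: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert P(A | B) == P(A) + P(B) - P(A & B)

# Sub-additivity: P(A ∪ B) <= P(A) + P(B)
assert P(A | B) <= P(A) + P(B)
```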


## New Probability Space P(·|B)
Define P(A | B) = P(A ∩ B) / P(B).
In the case of disjoint A and B, A ∩ B = ∅, which means P(A ∩ B) = 0, so P(A | B) = 0. This supports our intuition.

Conditional probability is a valid probability measure: the axioms of probability are satisfied, with B as the new sample space.
- Nonnegativity: P(A | B) ≥ 0.
- Normalization: P(B | B) = P(B ∩ B) / P(B) = P(B) / P(B) = 1.
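The definition can be checked by counting on a concrete sample space. A small sketch, using two fair dice as an illustrative example (not from the slides):

```python
from fractions import Fraction
from itertools import product

# Sample space: ordered outcomes of two fair six-sided dice.
omega = set(product(range(1, 7), repeat=2))

def P(event):
    """Uniform probability law on the 36 equally likely pairs."""
    return Fraction(len(event), len(omega))

A = {(i, j) for (i, j) in omega if i + j == 8}   # sum is 8
B = {(i, j) for (i, j) in omega if i == 6}       # first die shows 6

# P(A | B) = P(A ∩ B) / P(B): conditioning shrinks the sample space to B.
cond = P(A & B) / P(B)
assert cond == Fraction(1, 6)   # given a first 6, we need the second die = 2
```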


## Multiplication Rule
Shorthand: P(∩ᵢ₌₁ⁿ Aᵢ) ≡ P(A₁ ∩ A₂ ∩ ⋯ ∩ Aₙ).
The rule: P(∩ᵢ₌₁ⁿ Aᵢ) = P(A₁) P(A₂ | A₁) P(A₃ | A₁ ∩ A₂) ⋯ P(Aₙ | A₁ ∩ ⋯ ∩ Aₙ₋₁).
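A quick numeric instance of the rule (an illustrative example, not from the slides): the probability of drawing three aces in a row from a standard 52-card deck, without replacement.

```python
from fractions import Fraction

# Ai = "the i-th card drawn is an ace", drawing without replacement.
# P(A1 ∩ A2 ∩ A3) = P(A1) * P(A2 | A1) * P(A3 | A1 ∩ A2)
p = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
print(p)  # 1/5525
```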
## Law of Total Probability
Let A₁, A₂, …, Aₙ partition Ω with each P(Aᵢ) > 0. Then
P(B) = P(A₁ ∩ B) + P(A₂ ∩ B) + ⋯ + P(Aₙ ∩ B)
     = P(A₁) P(B | A₁) + P(A₂) P(B | A₂) + ⋯ + P(Aₙ) P(B | Aₙ).
Recall the additivity rule for disjoint events.
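A sketch of the law on a two-urn setup (the numbers are hypothetical, chosen for illustration): urn 1 holds 3 red and 1 blue ball, urn 2 holds 1 red and 3 blue, and we pick an urn uniformly at random before drawing a ball.

```python
from fractions import Fraction

# Partition: A1 = "urn 1 chosen", A2 = "urn 2 chosen"; B = "red ball drawn".
P_A = [Fraction(1, 2), Fraction(1, 2)]          # P(A1), P(A2)
P_B_given_A = [Fraction(3, 4), Fraction(1, 4)]  # P(B|A1), P(B|A2)

# Law of total probability: P(B) = sum_i P(Ai) * P(B|Ai)
P_B = sum(pa * pb for pa, pb in zip(P_A, P_B_given_A))
assert P_B == Fraction(1, 2)
```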


## Bayes’ Rule
Let A₁, A₂, …, Aₙ partition Ω with each P(Aᵢ) > 0. For any B such that P(B) > 0,
P(Aᵢ | B) = P(Aᵢ) P(B | Aᵢ) / P(B)
          = P(Aᵢ) P(B | Aᵢ) / [P(A₁) P(B | A₁) + P(A₂) P(B | A₂) + ⋯ + P(Aₙ) P(B | Aₙ)].
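Bayes’ rule in a standard rare-event setting (all numbers here are hypothetical): a condition with a 1% prior, and a test that reports positive 95% of the time when the condition is present and 10% of the time when it is absent.

```python
from fractions import Fraction

# Partition: A1 = "has condition", A2 = "does not"; B = "test is positive".
prior = [Fraction(1, 100), Fraction(99, 100)]        # P(A1), P(A2)
likelihood = [Fraction(95, 100), Fraction(10, 100)]  # P(B|A1), P(B|A2)

# Denominator via the law of total probability.
P_B = sum(p * l for p, l in zip(prior, likelihood))

# Bayes' rule: P(A1 | B) = P(A1) P(B|A1) / P(B)
posterior = prior[0] * likelihood[0] / P_B
print(posterior)  # 19/217, roughly 8.8% despite the accurate-looking test
```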
## Discrete Probability Laws
If Ω is finite and all outcomes are equally likely, then P(A) = |A| / |Ω|.
Sometimes it is challenging to compute |A| and |Ω|, and they are too large to work out by hand…
## Shortcuts for Counting
- Permutations: there are n! = n × (n − 1) × ⋯ × 2 × 1 ways to permute n objects. E.g., the permutations of {a, b, c} are abc, acb, bac, bca, cab, cba.
- k-Permutations: there are n × (n − 1) × ⋯ × (n − k + 1) ways to choose the first k elements of a permutation of n objects. E.g., the 2-permutations of {a, b, c, d} are ab, ac, ad, ba, bc, bd, ca, cb, cd, da, db, dc.
- Combinations: there are C(n, k) = n! / (k! (n − k)!) ways to choose an (unordered) subset of k objects from n objects.
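Python’s standard library covers all three counts directly, which makes it easy to cross-check the formulas against brute-force enumeration. A minimal sketch:

```python
from itertools import permutations
from math import comb, factorial, perm

# Permutations: n! orderings of n objects (here n = 3).
assert factorial(3) == len(list(permutations("abc"))) == 6

# k-Permutations: n * (n-1) * ... * (n-k+1) (here n = 4, k = 2).
assert perm(4, 2) == len(list(permutations("abcd", 2))) == 12

# Combinations: C(n, k) = n! / (k! * (n-k)!) (here C(4, 2)).
assert comb(4, 2) == factorial(4) // (factorial(2) * factorial(2)) == 6
```

`math.perm` and `math.comb` require Python 3.8 or later.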
