Random Variables and Probability Distributions: Lecture IV
Charles B. Moss
August 28, 2010

Outline
1 Conditional Probability and Independence
    Axioms of Conditional Probability
2 Basic Concept of Random Variables
    Binomial Conditional Probabilities
    Uncorrelated Discrete Normal
    Uncorrelated Normal Conditional Probabilities
    Correlated Discrete Normal
    Correlated Normal Conditional Probabilities
3 Univariate Continuous Random Variables
4 Useful Univariate Distributions
Conditional Probability and Independence

In order to define the concept of a conditional probability it is necessary to discuss joint probabilities and marginal probabilities.
- A joint probability is the probability of two random events. For example, consider drawing two cards from a deck of cards. There are 52 × 51 = 2,652 different ordered combinations for the first two cards drawn from the deck.
- The marginal probability is the overall probability of a single event, or the probability of drawing a given card.
- The conditional probability of an event is the probability of that event given that some other event has occurred.
- In the textbook example, what is the probability of the die showing a one if you know that the face number is odd? (1/3). Note, however, that if you know the roll of the die is a one, the probability of the roll being odd is 1.
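The counts above can be checked by brute-force enumeration. The short Python sketch below is an illustration added here (not part of the original slides): it conditions the die on the odd faces and counts the ordered two-card draws.

from itertools import permutations

# Die example: condition on the event that the face is odd.
faces = [1, 2, 3, 4, 5, 6]
odd = [f for f in faces if f % 2 == 1]

# P(one | odd): among the three equally likely odd faces, one is a "1".
p_one_given_odd = sum(1 for f in odd if f == 1) / len(odd)

# P(odd | one): conditioning on the roll being a one leaves a single outcome.
ones = [f for f in faces if f == 1]
p_odd_given_one = sum(1 for f in ones if f % 2 == 1) / len(ones)

# Card example: number of ordered ways to draw the first two cards.
two_card_draws = len(list(permutations(range(52), 2)))

print(p_one_given_odd)   # 0.3333...
print(p_odd_given_one)   # 1.0
print(two_card_draws)    # 2652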

Axioms of Conditional Probability

1. P(A|B) ≥ 0 for any event A.
2. P(A|B) = 1 for any event A ⊃ B.
3. If {A_i ∩ B}, i = 1, 2, ..., are mutually exclusive, then
   P(A_1 ∪ A_2 ∪ ... | B) = P(A_1|B) + P(A_2|B) + ...   (1)
4. If B ⊃ H, B ⊃ G, and P(G) ≠ 0, then
   P(H|B) / P(G|B) = P(H) / P(G)   (2)
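These axioms can be verified numerically on a small finite example. The sketch below is an added illustration (not from the slides): it builds the uniform measure for a fair die, conditions on the odd faces, and asserts each axiom, with the helper cond using the formula of Theorem 2.4.1 given just below.

from fractions import Fraction

# Finite sample space: one fair die. Conditioning event B = {odd faces}.
omega = {1, 2, 3, 4, 5, 6}
B = {1, 3, 5}

def prob(event):
    # Unconditional probability under the uniform measure.
    return Fraction(len(event & omega), len(omega))

def cond(event, given):
    # P(event | given) computed as P(event and given) / P(given).
    return prob(event & given) / prob(given)

# Axiom 1: nonnegativity.
assert cond({1}, B) >= 0

# Axiom 2: any event containing B has conditional probability 1.
assert cond({1, 3, 5, 6}, B) == 1

# Axiom 3: additivity over events whose intersections with B are disjoint.
A1, A2 = {1}, {3, 4}
assert cond(A1 | A2, B) == cond(A1, B) + cond(A2, B)

# Axiom 4: for H, G contained in B, conditioning preserves probability ratios.
H, G = {1}, {3, 5}
assert cond(H, B) / cond(G, B) == prob(H) / prob(G)

print("all four axioms hold on this example")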
Theorem 2.4.1: P(A|B) = P(A ∩ B) / P(B) for any pair of events A and B such that P(B) > 0.

Theorem 2.4.2 (Bayes' Theorem): Let events A_1, A_2, ..., A_n be mutually exclusive events such that P(A_1 ∪ A_2 ∪ ... ∪ A_n) = 1 and P(A_i) > 0 for each i. Let E be an arbitrary event such that P(E) > 0. Then

   P(A_i | E) = P(E | A_i) P(A_i) / [ Σ_{j=1}^{n} P(E | A_j) P(A_j) ]   (3)
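A quick numerical sketch of equation (3), assuming made-up priors and likelihoods purely for illustration (these values are not from the slides):

# Bayes' theorem with three mutually exclusive hypotheses A_1, A_2, A_3.
priors = [0.5, 0.3, 0.2]        # P(A_1), P(A_2), P(A_3); they sum to 1
likelihoods = [0.9, 0.5, 0.1]   # P(E | A_1), P(E | A_2), P(E | A_3)

# Denominator of (3): P(E) = sum over j of P(E | A_j) P(A_j).
p_e = sum(l * p for l, p in zip(likelihoods, priors))

# Posterior for each hypothesis: P(A_i | E) = P(E | A_i) P(A_i) / P(E).
posteriors = [l * p / p_e for l, p in zip(likelihoods, priors)]

print(p_e)          # 0.62
print(posteriors)   # [0.7258..., 0.2419..., 0.0322...]; they sum to 1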

Another manifestation of this theorem follows from the joint distribution.