# Slides04-2010 - Conditional Probability and Distribution Functions: Lecture IV

Conditional Probability and Distribution Functions: Lecture IV
Charles B. Moss
August 27, 2010

## Outline

- Conditional Probability and Independence
  - Bayes Theory
- Useful Distribution Functions
  - Normal Distributions
  - Uniform Distributions
  - Gamma Distribution
- Transformation of Random Variables
  - Inverse Hyperbolic Sine
- Triangular Distribution Function
## Conditional Probability and Independence

- In order to define the concept of a conditional probability, it is necessary to define joint and marginal probabilities.
- The joint probability is the probability of a particular combination of two or more random variables.
- Taking the roll of two dice as an example, the probability of rolling a 4 on one die and a 6 on the other die is 1/36.
- There are 36 possible outcomes of the two dice: {1,1}, {1,2}, ..., {2,1}, {2,2}, ..., {6,6}.
- Therefore, the probability of {4,6}, given that the dice are fair, is 1/36.
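The joint probability in the dice example can be checked by direct enumeration. This is a minimal sketch (not part of the original slides): it lists all 36 equally likely outcomes and counts the one matching {4, 6}.

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of rolling two fair dice.
outcomes = list(product(range(1, 7), repeat=2))
assert len(outcomes) == 36

# Joint probability of the particular combination {4, 6}:
# one favorable outcome out of 36.
p_joint = Fraction(sum(1 for (a, b) in outcomes if a == 4 and b == 6),
                   len(outcomes))
print(p_joint)  # 1/36
```

Using `Fraction` keeps the probabilities exact rather than approximating 1/36 as a float.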

- The marginal probability is the probability of one of the random variables irrespective of the outcome of the other variable.
- Going back to the dice example, there are six different rolls of the dice where the value of the first die is 4:

  {4,1}, {4,2}, {4,3}, {4,4}, {4,5}, {4,6}  (1)

- Hence, again assuming that the dice are fair, the marginal probability of x₁ = 4 is

  P[x₁ = 4] = P[{4,1}] + P[{4,2}] + P[{4,3}] + P[{4,4}] + P[{4,5}] + P[{4,6}]
            = 1/36 + 1/36 + 1/36 + 1/36 + 1/36 + 1/36 = 6/36 = 1/6  (2)
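Equation (2), the marginal probability, can likewise be verified by summing the joint probabilities over every value of the second die. A short sketch, again assuming two fair dice:

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

# Marginal probability that the first die shows 4: sum the joint
# probability 1/36 over the six outcomes {4,1}, ..., {4,6}.
p_marginal = sum(Fraction(1, 36) for (a, b) in outcomes if a == 4)
print(p_marginal)  # 1/6
```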
- The conditional probability is then the probability of one event, such as the probability that the first die is a 4, given that the value of another random variable is known, such as the fact that the value of the second die roll is equal to 6. In the foregoing example, with fair dice, this value is 1/6.
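The slide's claim that the conditional probability equals 1/6 can be confirmed with the standard definition P(A | B) = P(A and B) / P(B). The sketch below (an illustration, not from the slides) also shows why 1/6 signals independence: conditioning on the second die does not change the marginal probability of the first.

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))
n = len(outcomes)

# Joint: first die is 4 AND second die is 6.
p_joint = Fraction(sum(1 for (a, b) in outcomes if a == 4 and b == 6), n)
# Marginal: second die is 6.
p_second_6 = Fraction(sum(1 for (a, b) in outcomes if b == 6), n)

# Conditional probability P(first = 4 | second = 6) = joint / marginal.
p_cond = p_joint / p_second_6
print(p_cond)  # 1/6 -- equal to P(first = 4), so the two rolls are independent
```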

