Lecture04-2010 - Random Variables and Probability...

Random Variables and Probability Distributions: Lecture IV
Charles B. Moss
June 30, 2010

I. Conditional Probability and Independence

A. In order to define the concept of a conditional probability it is necessary to discuss joint probabilities and marginal probabilities.

1. A joint probability is the probability of two random events occurring together. For example, consider drawing two cards from a deck of cards. There are 52 x 51 = 2,652 different ordered combinations of the first two cards from the deck.

2. The marginal probability is the overall probability of a single event, such as the probability of drawing a given card.

3. The conditional probability of an event is the probability of that event given that some other event has occurred.

a) In the textbook, what is the probability of the die being a one if you know that the face number is odd? (1/3).

b) However, note that if you know that the roll of the die is a one, then the probability of the roll being odd is 1.

B. Axioms of Conditional Probability:

1. P(A|B) ≥ 0 for any event A.

2. P(A|B) = 1 for any event A ⊃ B.

3. If {A_i ∩ B}, i = 1, 2, ..., are mutually exclusive, then

   P(A_1 ∪ A_2 ∪ ... | B) = P(A_1|B) + P(A_2|B) + ...   (1)
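The die example above can be checked by direct enumeration of equally likely outcomes; a minimal Python sketch (the variable names are illustrative, not from the lecture):

```python
from fractions import Fraction

# Equally likely faces of a fair die.
faces = [1, 2, 3, 4, 5, 6]

# P(face = 1 | face is odd): restrict to the conditioning event, then count.
odd = [f for f in faces if f % 2 == 1]
p_one_given_odd = Fraction(sum(1 for f in odd if f == 1), len(odd))
print(p_one_given_odd)  # 1/3

# P(face is odd | face = 1): conditioning on a one, the roll is certainly odd.
ones = [f for f in faces if f == 1]
p_odd_given_one = Fraction(sum(1 for f in ones if f % 2 == 1), len(ones))
print(p_odd_given_one)  # 1

# Ordered combinations of the first two cards drawn from a 52-card deck.
print(52 * 51)  # 2652
```

Restricting the sample space to the conditioning event and renormalizing is exactly what Theorem 2.4.1 below formalizes as P(A|B) = P(A ∩ B)/P(B).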
AEB 6571 Econometric Methods I
Professor Charles B. Moss
Lecture IV
Fall 2010

4. If H ⊂ B, G ⊂ B, and P(G) ≠ 0, then

   P(H|B)/P(G|B) = P(H)/P(G)   (2)

C. Theorem 2.4.1: P(A|B) = P(A ∩ B)/P(B) for any pair of events A and B such that P(B) > 0.

D. Theorem 2.4.2 (Bayes' Theorem): Let events A_1, A_2, ..., A_n be mutually exclusive events such that P(A_1 ∪ A_2 ∪ ... ∪ A_n) = 1 and P(A_i) > 0 for each i. Let E be an arbitrary event such that P(E) > 0. Then

   P(A_i|E) = P(E|A_i) P(A_i) / Σ_{j=1}^n P(E|A_j) P(A_j)   (3)

1. Another manifestation of this theorem comes from the joint distribution function:

   P(E, A_i) = P(E ∩ A_i) = P(E|A_i) P(A_i)   (4)

2. Summing this equality over i yields the marginal probability of event E:

   P(E) = Σ_{i=1}^n P(E|A_i) P(A_i)   (5)

3. This yields a friendlier version of Bayes' theorem based on the ratio between the joint and marginal distribution functions:

   P(A_i|E) = P(E, A_i)/P(E)   (6)

E. Statistical independence is when the probability of one random event does not depend on the occurrence of another; formally, events A and B are independent if P(A|B) = P(A), or equivalently P(A ∩ B) = P(A) P(B).
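Equations (3) through (6) can be traced numerically. The sketch below uses hypothetical priors and likelihoods (the numbers are illustrative, not from the lecture): it forms the joints via equation (4), the marginal via equation (5), and the posteriors via equation (6).

```python
# Hypothetical partition A_1, A_2, A_3 with priors P(A_i) summing to 1,
# and likelihoods P(E|A_i); values chosen only for illustration.
priors = [0.5, 0.3, 0.2]          # P(A_i)
likelihoods = [0.9, 0.5, 0.1]     # P(E|A_i)

# Equation (4): joint probabilities P(E, A_i) = P(E|A_i) P(A_i).
joints = [l * p for l, p in zip(likelihoods, priors)]

# Equation (5): marginal probability P(E) = sum_i P(E|A_i) P(A_i).
p_e = sum(joints)

# Equation (6): posteriors P(A_i|E) = P(E, A_i) / P(E),
# which is algebraically identical to equation (3).
posteriors = [j / p_e for j in joints]

print(round(p_e, 2))                       # 0.62
print([round(p, 4) for p in posteriors])   # posteriors sum to 1
```

Note that the posteriors always sum to one because the denominator in equation (3) is exactly the sum of the numerators over i.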

This note was uploaded on 07/15/2011 for the course AEB 6180 taught by Professor Staff during the Spring '10 term at University of Florida.

