Lecture Stat 302 Introduction to Probability - Slides 17. AD, March 2010.
Jointly Distributed Random Variables

Assume we have two random variables $X$ and $Y$ with joint c.d.f.
$$F(a, b) = P(X \le a, Y \le b).$$
The c.d.f. of $X$ is
$$F_X(a) = P(X \le a, Y < \infty) = \lim_{b \to \infty} P(X \le a, Y \le b) = \lim_{b \to \infty} F(a, b).$$
Similarly, the c.d.f. of $Y$ is
$$F_Y(b) = P(X < \infty, Y \le b) = \lim_{a \to \infty} F(a, b).$$
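As a concrete check of the limit relation above, one can tabulate a small joint distribution and confirm that evaluating the joint c.d.f. at a value of $b$ beyond the support of $Y$ recovers the marginal c.d.f. of $X$. This is a minimal sketch; the joint pmf values below are made up for illustration and do not come from the slides:

```python
# Hypothetical joint pmf of (X, Y) on a small grid (values invented for illustration).
joint_pmf = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

def F(a, b):
    """Joint cdf F(a, b) = P(X <= a, Y <= b)."""
    return sum(p for (x, y), p in joint_pmf.items() if x <= a and y <= b)

def F_X(a):
    """Marginal cdf of X: sum the joint pmf over all y, keeping x <= a."""
    return sum(p for (x, y), p in joint_pmf.items() if x <= a)

# Taking b far beyond the support of Y plays the role of b -> infinity.
assert abs(F(0, 10**9) - F_X(0)) < 1e-12  # both equal 0.3
```

The same computation with a large $a$ recovers $F_Y(b)$; for finitely supported variables the limit is reached as soon as the argument exceeds the support.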
Consider
$$\begin{aligned}
P(X > a, Y > b) &= 1 - P(\{X > a, Y > b\}^c) \\
&= 1 - P(\{X > a\}^c \cup \{Y > b\}^c) \\
&= 1 - P(\{X \le a\} \cup \{Y \le b\}) \\
&= 1 - \left[ P(X \le a) + P(Y \le b) - P(X \le a, Y \le b) \right] \\
&= 1 - F_X(a) - F_Y(b) + F(a, b).
\end{aligned}$$
For discrete random variables, we work directly with the joint p.m.f.
$$p(x, y) = P(X = x, Y = y),$$
from which we obtain the marginals
$$p_X(x) = \sum_y p(x, y), \qquad p_Y(y) = \sum_x p(x, y).$$
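Both results above, the inclusion-exclusion identity for $P(X > a, Y > b)$ and the marginal p.m.f. sums, can be verified numerically. A minimal Python sketch; the joint pmf is an invented example, not from the lecture:

```python
from collections import defaultdict

# Hypothetical joint pmf for illustration (values invented, not from the lecture).
p = {(1, 1): 0.2, (1, 2): 0.1, (2, 1): 0.25, (2, 2): 0.45}

# Marginals: p_X(x) = sum over y of p(x, y); p_Y(y) = sum over x of p(x, y).
p_X, p_Y = defaultdict(float), defaultdict(float)
for (x, y), prob in p.items():
    p_X[x] += prob
    p_Y[y] += prob

def F(a, b):
    """Joint cdf F(a, b) = P(X <= a, Y <= b)."""
    return sum(prob for (x, y), prob in p.items() if x <= a and y <= b)

def F_X(a):
    return sum(prob for x, prob in p_X.items() if x <= a)

def F_Y(b):
    return sum(prob for y, prob in p_Y.items() if y <= b)

a, b = 1, 1
# Direct computation of P(X > a, Y > b) versus the inclusion-exclusion identity.
direct = sum(prob for (x, y), prob in p.items() if x > a and y > b)
identity = 1 - F_X(a) - F_Y(b) + F(a, b)
assert abs(direct - identity) < 1e-12  # both equal 0.45
```

The agreement holds for every choice of $(a, b)$, since the derivation uses only complementation and inclusion-exclusion, which are valid for any joint distribution.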

This note was uploaded on 10/21/2010 for the course STAT Stat302 taught by Professor 222 during the Spring '10 term at The University of British Columbia.

