elemprob-fall2010-page24

10 Joint distributions

Given two discrete random variables X and Y, we can talk about the joint distribution or joint density P(X = x, Y = y). Here the comma means "and," so this is an abbreviation for P((X = x) \cap (Y = y)). An example: suppose we roll two dice, X is the number on the first die, and Y is the total.

We say X and Y are independent if

P(X = x, Y = y) = P(X = x) P(Y = y)

for all x and y.

Given the joint density, we can find the density of X and the density of Y by summing:

P(X = x) = \sum_y P(X = x, Y = y),    P(Y = y) = \sum_x P(X = x, Y = y).

If we are given the joint density of X and Y, then the densities of X and Y obtained this way are called the marginal densities or marginal distributions.

If we know the joint density of X and Y, we can talk about the conditional distribution (density) of X given Y = y. This is the expression

P(X = x \mid Y = y) = \frac{P(X = x, Y = y)}{P(Y = y)}.

Just as E g(X) = \sum_x g(x) P(X = x), we have

E g(X, Y) = \sum_{x,y} g(x, y) P(X = x, Y = y).
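As a concrete check on the two-dice example, here is a small Python sketch (the names `joint`, `marginal_X`, and `marginal_Y` are my own, not from the notes) that builds the joint density of X and Y with exact fractions and recovers the marginals by summing over the other variable:

```python
from fractions import Fraction
from itertools import product

# Joint density of X (first die) and Y (total) for two fair dice.
joint = {}
for d1, d2 in product(range(1, 7), repeat=2):
    joint[(d1, d1 + d2)] = joint.get((d1, d1 + d2), 0) + Fraction(1, 36)

# Marginals: sum the joint density over the other variable.
marginal_X = {}
marginal_Y = {}
for (x, y), p in joint.items():
    marginal_X[x] = marginal_X.get(x, 0) + p
    marginal_Y[y] = marginal_Y.get(y, 0) + p

print(marginal_X[1])                    # 1/6
print(marginal_Y[12])                   # 1/36
print(joint.get((1, 12), Fraction(0)))  # 0
```

Note that here X and Y are not independent: P(X = 1, Y = 12) = 0, while P(X = 1) P(Y = 12) = 1/216, so the product formula fails for that pair (x, y).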
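The conditional-density and expectation formulas can be sketched on the same dice example (the helper names `conditional_X_given_Y` and `expectation` are illustrative, not from the notes):

```python
from fractions import Fraction
from itertools import product

# Same two-dice joint density as above: X is the first die, Y the total.
joint = {}
for d1, d2 in product(range(1, 7), repeat=2):
    joint[(d1, d1 + d2)] = joint.get((d1, d1 + d2), 0) + Fraction(1, 36)

# P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y).
def conditional_X_given_Y(y):
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)
    return {x: p / p_y for (x, yy), p in joint.items() if yy == y}

# Given the total is 4, X is uniform on {1, 2, 3}.
print(conditional_X_given_Y(4))
# {1: Fraction(1, 3), 2: Fraction(1, 3), 3: Fraction(1, 3)}

# E g(X, Y) = sum over (x, y) of g(x, y) P(X = x, Y = y).
def expectation(g):
    return sum(g(x, y) * p for (x, y), p in joint.items())

print(expectation(lambda x, y: x * y))  # 329/12
```

The value 329/12 agrees with computing E[XY] directly: writing Y = X + Z with Z the second die, E[XY] = E[X^2] + E[X] E[Z] = 91/6 + 49/4 = 329/12.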

This note was uploaded on 12/29/2011 for the course MATH 316 taught by Professor Ansan during the Spring '10 term at SUNY Stony Brook.
