The joint pmf of (X, Y) contains all information about the distribution of X and Y. In particular, we can extract f_X and f_Y, the pmfs for X and Y individually, from the joint pmf f_{XY}. The individual pmfs f_X and f_Y are referred to as the marginal probability mass functions for X and Y. The marginal pmf of X evaluated at a point x is obtained by summing f_{XY}(x, y) over all values of y, while the marginal pmf of Y evaluated at a point y is obtained by summing f_{XY}(x, y) over all values of x:

    f_X(x) = \sum_y f_{XY}(x, y)    and    f_Y(y) = \sum_x f_{XY}(x, y).

A pair of random variables (X, Y) is said to be continuous if the probability of (X, Y) being equal to any fixed pair of values (x, y) is zero. In fact, if we want to be very precise, then continuity means a little more than this: the probability of (X, Y) lying in any subset of the plane that has zero area must be zero. When (X, Y) is continuous, its behavior may be described by its joint probability density function, or joint pdf, denoted f_{XY}. If (a, b) and (c, d) are two intervals on the real line, then the probability that (X, Y) lies in the rectangle (a, b) × (c, d), that is, the probability that X lies in (a, b) and Y lies in (c, d), is equal to the volume underneath f_{XY} within the rectangle (a, b) × (c, d):

    P(a < X < b and c < Y < d) = \int_c^d \int_a^b f_{XY}(x, y) \, dx \, dy.

Clearly, we must have f_{XY}(x, y) \ge 0 for all x and y, and the total volume underneath the joint pdf must be one: \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x, y) \, dx \, dy = 1.

As in the discrete case, we may extract the pdfs f_X and f_Y for X and Y from the joint pdf f_{XY} for (X, Y). The univariate pdfs f_X and f_Y are referred to as the marginal probability density functions, and may be obtained by "integrating out" one of the two arguments of f_{XY}:

    f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y) \, dy    and    f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y) \, dx.

6 Covariance and independence

Given a discrete pair of random variables (X, Y) with joint pmf f_{XY}, and a real-valued function g of two variables, we may compute the expected value of g(X, Y) as follows:

    E(g(X, Y)) = \sum_x \sum_y g(x, y) f_{XY}(x, y).

Similarly, if (X, Y) is continuous with joint pdf f_{XY}, we have

    E(g(X, Y)) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f_{XY}(x, y) \, dx \, dy.
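To make the discrete formulas concrete, here is a minimal Python/NumPy sketch that stores a small joint pmf as a table, recovers the marginal pmfs by summing over one index, and evaluates E(g(X, Y)) by the double sum above. The particular pmf values and the choice g(x, y) = (x - y)^2 are hypothetical, purely for illustration.

```python
import numpy as np

# Hypothetical joint pmf f_XY(x, y) with X taking values {0, 1, 2} and
# Y taking values {0, 1}; rows index x, columns index y, entries sum to 1.
x_vals = np.array([0, 1, 2])
y_vals = np.array([0, 1])
f_xy = np.array([[0.10, 0.20],
                 [0.25, 0.15],
                 [0.05, 0.25]])

# Marginal pmfs: sum the joint pmf over the other variable.
f_x = f_xy.sum(axis=1)   # f_X(x) = sum_y f_XY(x, y)  -> [0.30, 0.40, 0.30]
f_y = f_xy.sum(axis=0)   # f_Y(y) = sum_x f_XY(x, y)  -> [0.40, 0.60]

# E(g(X, Y)) = sum_x sum_y g(x, y) f_XY(x, y), here with g(x, y) = (x - y)^2.
g = (x_vals[:, None] - y_vals[None, :]) ** 2
e_g = np.sum(g * f_xy)   # 0.90 for this particular pmf

print(f_x, f_y, e_g)
```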
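The continuous formulas can be checked numerically in the same spirit. The sketch below assumes a hypothetical joint pdf f_XY(x, y) = x + y on the unit square (and zero elsewhere), which is nonnegative and integrates to one, and uses scipy.integrate to verify the normalization, evaluate a rectangle probability, and compute the marginal f_X at one point.

```python
from scipy.integrate import dblquad, quad

# Hypothetical joint pdf: f_XY(x, y) = x + y on [0, 1] x [0, 1], 0 elsewhere.
def f_xy(x, y):
    return x + y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

# Total volume under the joint pdf should be 1.
# Note: dblquad integrates func(y, x) with y as the inner variable.
total, _ = dblquad(lambda y, x: f_xy(x, y), 0, 1, 0, 1)

# P(a < X < b and c < Y < d): integrate the joint pdf over the rectangle.
a, b, c, d = 0.2, 0.6, 0.1, 0.5
prob, _ = dblquad(lambda y, x: f_xy(x, y), a, b, c, d)

# Marginal pdf of X: integrate out y, e.g. f_X(0.3) = 0.3 + 1/2 = 0.8.
fx_at_03, _ = quad(lambda y: f_xy(0.3, y), 0, 1)

print(total)      # ~1.0
print(prob)       # ~0.112
print(fx_at_03)   # ~0.8
```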
Recall that the covariance between any two random variables X and Y can be written as

    Cov(X, Y) = E((X - E(X))(Y - E(Y))) = E(XY) - E(X) E(Y).

Letting g(x, y) = xy, we may use the above expressions for E(g(X, Y)) to calculate E(XY). For discrete (X, Y) we have

    E(XY) = \sum_x \sum_y x y f_{XY}(x, y),

implying that

    Cov(X, Y) = \sum_x \sum_y x y f_{XY}(x, y) - \left( \sum_x x f_X(x) \right) \left( \sum_y y f_Y(y) \right).

For continuous (X, Y) we have

    E(XY) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y f_{XY}(x, y) \, dx \, dy,

yielding

    Cov(X, Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y f_{XY}(x, y) \, dx \, dy - \left( \int_{-\infty}^{\infty} x f_X(x) \, dx \right) \left( \int_{-\infty}^{\infty} y f_Y(y) \, dy \right).

Two random variables X and Y are said to be independent if their joint pmf (or joint pdf) factors into the product of their marginal pmfs (or marginal pdfs). That is, X and Y are independent if

    f_{XY}(x, y) = f_X(x) f_Y(y)    for all x and y.
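As a numerical illustration of the discrete covariance formula, the sketch below reuses the hypothetical joint pmf from the earlier sketch and computes Cov(X, Y) by comparing E(XY) with E(X)E(Y); the pmf values are made up for illustration.

```python
import numpy as np

# Hypothetical joint pmf over X in {0, 1, 2} and Y in {0, 1} (rows = x, cols = y).
x_vals = np.array([0, 1, 2])
y_vals = np.array([0, 1])
f_xy = np.array([[0.10, 0.20],
                 [0.25, 0.15],
                 [0.05, 0.25]])

f_x = f_xy.sum(axis=1)                            # marginal pmf of X
f_y = f_xy.sum(axis=0)                            # marginal pmf of Y

e_xy = np.sum(np.outer(x_vals, y_vals) * f_xy)    # E(XY) = 0.65
e_x = np.sum(x_vals * f_x)                        # E(X)  = 1.00
e_y = np.sum(y_vals * f_y)                        # E(Y)  = 0.60

cov = e_xy - e_x * e_y                            # Cov(X, Y) = 0.05
print(cov)
```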
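Independence is equally easy to test numerically: a joint pmf equals the outer product of its marginals exactly when X and Y are independent. A small sketch, again with made-up numbers:

```python
import numpy as np

# A joint pmf that is independent by construction: f_XY(x, y) = f_X(x) * f_Y(y).
f_x = np.array([0.3, 0.4, 0.3])
f_y = np.array([0.4, 0.6])
f_xy_indep = np.outer(f_x, f_y)

# A joint pmf with the same marginals that is NOT a product of its marginals.
f_xy_dep = np.array([[0.10, 0.20],
                     [0.25, 0.15],
                     [0.05, 0.25]])

def is_independent(f_xy, tol=1e-12):
    """Check whether f_XY(x, y) = f_X(x) f_Y(y) holds for all x and y."""
    fx = f_xy.sum(axis=1)
    fy = f_xy.sum(axis=0)
    return np.allclose(f_xy, np.outer(fx, fy), atol=tol)

print(is_independent(f_xy_indep))  # True
print(is_independent(f_xy_dep))    # False
```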
