MA2216/ST2131 Probability

Notes 6: Joint Probability Distributions

So far, we have only considered probability distributions for single random variables. There will be situations, however, where we may find it desirable to record the simultaneous outcomes of several random variables. For example, we might want to know simultaneously the temperature and the relative humidity at a specified time. A nurse may be interested in her patient's height and weight. A student may be concerned with his or her scores for the final examinations.

In this course we are primarily concerned with the bivariate case, that is, a pair of random variables considered at the same time. The generalization to multivariate cases is, however, straightforward.

§1. Discrete Cases

1. The joint probability density function (joint p.d.f.) of a pair of discrete random variables $X$ and $Y$ is defined by
   $f_{(X,Y)}(x,y) = \mathrm{P}\{X = x,\, Y = y\}$.

2. Properties:
   (i) $f_{(X,Y)}(x,y) \ge 0$.
   (ii) $\displaystyle\sum_{x}\sum_{y} f_{(X,Y)}(x,y) = 1$.
   (iii) For any region $A$ in the $xy$-plane,
        $\displaystyle \mathrm{P}\{(X,Y) \in A\} = \sum_{(x,y)\in A} f_{(X,Y)}(x,y)$.
   (iv) $f_X(x) = \mathrm{P}(X = x) = \displaystyle\sum_{y} f_{(X,Y)}(x,y)$, which is called the marginal density function of $X$. Note that it is in fact the density function of $X$.
   (v) $f_Y(y) = \mathrm{P}(Y = y) = \displaystyle\sum_{x} f_{(X,Y)}(x,y)$, which is the marginal density function of $Y$. It is also the density function of $Y$.

3. Example: The joint probability density function of $X$ and $Y$ is given by
   $f_{(X,Y)}(x,y) = cxy$, for $x = 1, 2, 3$; $y = 1, 2, 3$.
   Find $c$, $f_X$, $f_Y$ and $\mathrm{P}\{X + Y \le 3\}$.

   Sol. To find the value of $c$, observe first that
   $1 = \displaystyle\sum_{x=1}^{3}\sum_{y=1}^{3} cxy = c \sum_{x=1}^{3} x \left[\sum_{y=1}^{3} y\right] = 6c \sum_{x=1}^{3} x = 36c$.
   Thus, $c = 1/36$. Next, to find the marginals, we get
   $f_X(x) = \displaystyle\sum_{y=1}^{3} f(x,y) = \frac{x}{36} \sum_{y=1}^{3} y = \frac{x}{6}$,
   for $x = 1, 2, 3$, and in a similar fashion, $f_Y(y) = y/6$ for $y = 1, 2, 3$. More precisely, one can tabulate the joint p.d.f.
together with both marginals as follows:

      x \ y   |   1      2      3   | P{X = x}
   -----------+---------------------+---------
        1     |  1/36   1/18   1/12 |   1/6
        2     |  1/18   1/9    1/6  |   1/3
        3     |  1/12   1/6    1/4  |   1/2
   -----------+---------------------+---------
   P{Y = y}   |  1/6    1/3    1/2  |    1

   Finally,
   $\mathrm{P}\{X + Y \le 3\} = \mathrm{P}\{(1,1), (1,2), (2,1)\} = \frac{1}{36}[1 + 2 + 2] = \frac{5}{36}$.

   Remark. In fact, $X$ and $Y$ are mutually "independent", which will be made clear in §4.

§2. Continuous Cases

1. The function $f_{(X,Y)}(x,y)$ is called the joint probability density function (or joint density function) of the pair of continuous random variables $X$ and $Y$ if
   (i) $f_{(X,Y)}(x,y) \ge 0$,
   (ii) $\displaystyle\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{(X,Y)}(x,y)\,dx\,dy = 1$,
   (iii) for every $A \subset \mathbb{R}^2$, $\displaystyle \mathrm{P}\{(X,Y) \in A\} = \iint_{(x,y)\in A} f_{(X,Y)}(x,y)\,dx\,dy$.

2. Properties:
   (i) Note that $f_X(x) = \displaystyle\int_{-\infty}^{\infty} f_{(X,Y)}(x,y)\,dy$, which is called the marginal density function of $X$. Note that it is in fact the density function of $X$ itself. ...
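As a sanity check (a minimal sketch, not part of the original notes), the discrete example above can be verified with exact rational arithmetic in Python; the p.m.f. $f(x,y) = xy/36$ and the support $\{1,2,3\}^2$ are taken directly from the example:

```python
from fractions import Fraction

# Joint p.m.f. f(x,y) = xy/36 on x, y in {1, 2, 3}.
support = (1, 2, 3)
joint = {(x, y): Fraction(x * y, 36) for x in support for y in support}

# Property (ii): the joint probabilities sum to 1.
assert sum(joint.values()) == 1

# Properties (iv)/(v): marginals, obtained by summing out the other variable.
f_X = {x: sum(joint[x, y] for y in support) for x in support}
f_Y = {y: sum(joint[x, y] for x in support) for y in support}
assert f_X == {x: Fraction(x, 6) for x in support}  # f_X(x) = x/6

# Property (iii): P{X + Y <= 3} sums f over the region A = {(x,y): x + y <= 3}.
p = sum(v for (x, y), v in joint.items() if x + y <= 3)
assert p == Fraction(5, 36)

# Remark: X and Y are independent, since f(x,y) = f_X(x) f_Y(y) everywhere.
assert all(joint[x, y] == f_X[x] * f_Y[y] for x in support for y in support)
```

The use of `Fraction` keeps every probability exact, so the checks reproduce the table entries and the value $5/36$ without floating-point rounding.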
This note was uploaded on 03/19/2012 for the course SCIENCE ST2131 taught by Professor Forgot during the Fall '08 term at National University of Singapore.