Further, the marginal PDFs are
\[
f_X(x) = \int_0^1 6xy^2 \, dy = 2x \int_0^1 3y^2 \, dy = 2x,
\]
\[
f_Y(y) = \int_0^1 6xy^2 \, dx = 3y^2 \int_0^1 2x \, dx = 3y^2.
\]
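As a numerical sanity check (a sketch, not part of the original notes; the grid size `n` is an arbitrary choice), the marginals above can be verified with a simple midpoint-rule integration in Python:

```python
# Numerical check of the marginals of f(x, y) = 6xy^2 on the unit square.

def f_joint(x, y):
    return 6 * x * y**2

def marginal_x(x, n=10_000):
    # Midpoint-rule approximation of the integral over y in [0, 1]
    dy = 1.0 / n
    return sum(f_joint(x, (j + 0.5) * dy) for j in range(n)) * dy

def marginal_y(y, n=10_000):
    # Midpoint-rule approximation of the integral over x in [0, 1]
    dx = 1.0 / n
    return sum(f_joint((i + 0.5) * dx, y) for i in range(n)) * dx

# f_X(0.5) should be close to 2(0.5) = 1.0, and f_Y(0.5) close to 3(0.5)^2 = 0.75
print(marginal_x(0.5), marginal_y(0.5))
```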
Computing joint probabilities is done by integrating the joint distribution over the appropriate region. Suppose we want \(P(X + Y \le 1)\). The event of interest is
\[
A = \{(x, y) : x + y \le 1,\ 0 \le x, y \le 1\} = \{(x, y) : 0 \le x \le 1 - y,\ 0 \le y \le 1\},
\]
and so we integrate the joint density over this region:
\[
P(X + Y \le 1) = \int_0^1 \int_0^{1-y} 6xy^2 \, dx \, dy
= \int_0^1 \left( \int_0^{1-y} 2x \, dx \right) 3y^2 \, dy
= \int_0^1 3(1-y)^2 y^2 \, dy
\]
\[
= 3 \int_0^1 \left( y^2 - 2y^3 + y^4 \right) dy
= 3\left( \frac{1}{3} - \frac{2}{4} + \frac{1}{5} \right)
= \frac{1}{10}.
\]
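This probability can also be checked by simulation. Because \(f(x,y) = 6xy^2\) factors as \((2x)(3y^2)\), the two coordinates can be drawn independently via inverse-CDF sampling, using \(F_X(x) = x^2\) and \(F_Y(y) = y^3\). This sketch relies on that factorization:

```python
import random

random.seed(0)

# Inverse-CDF sampling: F_X(x) = x^2 => X = sqrt(U);  F_Y(y) = y^3 => Y = U^(1/3)
def sample_pair():
    x = random.random() ** 0.5
    y = random.random() ** (1 / 3)
    return x, y

n = 200_000
hits = sum(1 for _ in range(n) if sum(sample_pair()) <= 1)
print(hits / n)  # Monte Carlo estimate; should be close to the exact answer 1/10
```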
It is easy to find different joint distributions that have the same marginal distributions. In other words, knowing the marginals is not generally enough to recover the joint distribution. In the general case with \(m\) random variables, the marginal PDF for any one random variable is obtained by summing or integrating out over the dummy arguments for the other \(m - 1\) random variables. For example, with three discrete random variables \(X\), \(Y\), and \(Z\),
\[
f_X(x) = \sum_{j=1}^{\infty} \sum_{h=1}^{\infty} f_{X,Y,Z}(x, y_j, z_h).
\]
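The summing-out operation is mechanical, as the following sketch shows (the joint PMF here is a made-up illustration, not from the notes):

```python
# Marginalizing a three-way discrete PMF by summing out the other two variables.
# Hypothetical joint PMF over (x, y, z), chosen only for illustration.
joint = {
    (0, 0, 0): 0.10, (0, 0, 1): 0.15, (0, 1, 0): 0.05, (0, 1, 1): 0.20,
    (1, 0, 0): 0.25, (1, 0, 1): 0.10, (1, 1, 0): 0.05, (1, 1, 1): 0.10,
}

def marginal_X(joint):
    fx = {}
    for (x, y, z), p in joint.items():
        fx[x] = fx.get(x, 0.0) + p   # sum over the dummy arguments y and z
    return fx

print(marginal_X(joint))  # marginal PMF of X; each of the two values is 0.5 here
```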
2. Independent Random Variables

In an important case, knowing the marginal distributions of a set of random variables is enough to determine the joint distribution.

DEFINITION: Random variables \(X_1, X_2, \ldots, X_m\) are said to be independent if
\[
P(X_1 \in A_1, \ldots, X_m \in A_m) = P(X_1 \in A_1) \cdots P(X_m \in A_m)
\]
for all events (Borel sets) \(A_1, \ldots, A_m\).
It is known that it is sufficient for this equality to hold for intervals \((-\infty, x_j]\). In other words, a necessary and sufficient condition for independence of \(X_1, X_2, \ldots, X_m\) is that the joint CDF can be obtained as the product of the marginal CDFs:
\[
F_X(x_1, \ldots, x_m) = F_{X_1}(x_1) \cdots F_{X_m}(x_m) \quad \text{for all } x_j.
\]
An important advantage of having independent random variables is that we can compute any joint probability by computing the corresponding marginal probabilities and then multiplying them together.
With two RVs, independence is
\[
P(X \in A, Y \in B) = P(X \in A) \, P(Y \in B) \quad \text{for all events } A \text{ and } B,
\]
or
\[
F_{X,Y}(x, y) = F_X(x) \, F_Y(y) \quad \text{for all } x, y.
\]
It is often easier to characterize independence in terms of PDFs. \(X\) and \(Y\) are independent if and only if
\[
f_{X,Y}(x, y) = f_X(x) \, f_Y(y) \quad \text{for all } x, y.
\]
This characterization works for discrete, continuous, or partially continuous RVs.
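For the running example \(f(x,y) = 6xy^2\) with marginals \(f_X(x) = 2x\) and \(f_Y(y) = 3y^2\), the product condition can be checked pointwise on a grid (an illustrative sketch; the grid resolution is an arbitrary choice):

```python
# Checking the product condition f_{X,Y}(x, y) = f_X(x) f_Y(y) on a grid.
# For f(x, y) = 6xy^2 the condition holds, so X and Y are independent.

def f_joint(x, y):
    return 6 * x * y**2

def f_X(x):
    return 2 * x

def f_Y(y):
    return 3 * y**2

pts = [(i / 10, j / 10) for i in range(1, 10) for j in range(1, 10)]
independent = all(abs(f_joint(x, y) - f_X(x) * f_Y(y)) < 1e-12 for (x, y) in pts)
print(independent)  # True
```

A grid check like this can only refute independence, of course; the algebraic factorization is what proves it for all \((x, y)\).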
EXAMPLE (Discrete): Recall the example of rolling two fair dice with \(X = D_1 + D_2\) and \(Y = \max(D_1, D_2)\). Using the table we constructed, \(P(X = 4, Y = 3) = 1/18\), while \(P(X = 4) \, P(Y = 3) = (1/12)(5/36) = 5/432\). Since these differ, \(X\) and \(Y\) are not independent.
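The table-based calculation can be reproduced by enumerating all 36 equally likely outcomes, using exact rational arithmetic:

```python
from fractions import Fraction

# Enumerate all 36 equally likely rolls of two fair dice.
outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

def prob(event):
    # Exact probability of an event under the uniform distribution on outcomes.
    return Fraction(sum(1 for o in outcomes if event(o)), 36)

p_joint = prob(lambda o: o[0] + o[1] == 4 and max(o) == 3)   # P(X = 4, Y = 3)
p_x     = prob(lambda o: o[0] + o[1] == 4)                   # P(X = 4)
p_y     = prob(lambda o: max(o) == 3)                        # P(Y = 3)

print(p_joint, p_x * p_y)  # 1/18 vs 5/432: unequal, so X and Y are not independent
```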