19 Joint Dist II

Joint Distributions (April 12)

Independence of Random Variables

Let's compare the conditional and marginal distributions for this example by finding P_{X|Y}(x | Y=1) and P_{X|Y}(x | Y=2). For Y=2:

P(X=1 | Y=2) = 0.05 / 0.2 = 0.25
P(X=2 | Y=2) = 0.10 / 0.2 = 0.50
P(X=3 | Y=2) = 0.05 / 0.2 = 0.25

The joint distribution, with marginals in the last row and column:

            Y=1    Y=2    Y=3  |  P(X=x)
    X=1     .10    .05    .10  |   .25
    X=2     .20    .10    .20  |   .50
    X=3     .10    .05    .10  |   .25
    -------------------------------------
    P(Y=y)  .40    .20    .40  |

Independence

Notice that in this example, the conditional distributions of X and the marginal distribution of X are all the same! The reader can verify that the same is true for the conditional and marginal distributions of Y as well. What does this mean practically? It means that X is not affected by Y: no matter what value Y takes, the conditional distribution of X is the same as its marginal distribution.

More formally, two random variables with joint distribution P_{X,Y}(x,y) are said to be independent if any of the following three (equivalent) conditions holds:

1. P_{X,Y}(x,y) = P(X=x) * P(Y=y) for all x and y
2. P_{X|Y}(x | Y=y) = P(X=x) for all x and y such that P(Y=y) > 0
3. P_{Y|X}(y | X=x) = P(Y=y) for all x and y such that P(X=x) > 0

Example

Referring to the joint distribution given in the table above, prove that X and Y are independent. The easiest way to prove this is to show that condition 1 in the definition of independence holds (i.e., the joint equals the product of the marginals). We can construct the product of the marginals cell by cell; for instance, P(X=1) * P(Y=1) = 0.25 * 0.4 = 0.10, which matches the joint entry P_{X,Y}(1,1). Checking all nine cells shows the product of the marginals reproduces the joint table exactly.

Review

We have defined joint, marginal, and conditional distributions, and independence:

1. Joint: P_{X,Y}(x,y) = P(X=x and Y=y)
2. Marginal: P_X(x) = P(X=x) = sum over y of P_{X,Y}(x,y), and P_Y(y) = P(Y=y) = sum over x of P_{X,Y}(x,y)

This note was uploaded on 09/06/2010 for the course IE 111 taught by Professor Storer during the Spring '07 term at Lehigh University.
