
# IE 111 Fall Semester 2009 Note Set #11: Joint Distributions

This section covers joint distributions. Joint distributions allow us to deal with more than one random variable at the same time. It should be clear that we are often faced with more than one random variable, and thus need ways to model and analyze these situations. I also remind you that we were introduced to joint distributions in the notes and homework problems earlier in the class. The easiest case to handle is two random variables. Once one understands two random variables, the extension to more (higher-dimensional) random variables is conceptually not too difficult.

## Two Discrete Random Variables

The behavior of a single random variable is completely described by its distribution. Similarly, the behavior of a pair of random variables is completely described by their joint distribution. The joint distribution of two discrete random variables (say X and Y) is defined as follows:

P_{X,Y}(x,y) = P(X=x and Y=y)

### Example

Suppose the joint distribution of X and Y is given in the table below. Note that we have included row and column sums, or "marginal sums," for convenience (blank cells are zero probabilities):

| X \ Y   | 1    | 2    | 3    | 4    | Row sum |
|---------|------|------|------|------|---------|
| 1       | 0.01 | 0.05 | 0.07 | 0.01 | 0.14    |
| 2       | 0.04 | 0.04 | 0.08 |      | 0.16    |
| 3       | 0.15 | 0.15 | 0.20 |      | 0.50    |
| 4       | 0.07 | 0.04 |      |      | 0.11    |
| 5       | 0.09 |      |      |      | 0.09    |
| Col sum | 0.36 | 0.28 | 0.35 | 0.01 | 1       |

a) What do the probabilities in the table add up to?
Like any probability distribution, the probabilities must add up to one.

b) Find P(X=2 and Y=2). Answer: 0.04

c) Find P(X=4 and Y=1). Answer: 0.07

d) Find P(X=5 and Y=3). Answer: 0

e) Find P(X=1). In this case we don't care what Y equals; it can equal anything.
P(X=1) = P(X=1 and Y=1) + P(X=1 and Y=2) + P(X=1 and Y=3) + P(X=1 and Y=4)
We get this from the row sum of row one: Answer = 0.14
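The table lookups and "summing out" computations in this example can be reproduced in a short script. This is a minimal sketch in plain Python (the variable names are my own, not from the notes); the joint PMF is stored with rows for X = 1..5 and columns for Y = 1..4, with blank table cells entered as zeros:

```python
# Joint PMF of (X, Y) from the example table.
# joint[i][j] holds P(X = i+1 and Y = j+1); blank cells are 0.
joint = [
    [0.01, 0.05, 0.07, 0.01],   # X = 1
    [0.04, 0.04, 0.08, 0.00],   # X = 2
    [0.15, 0.15, 0.20, 0.00],   # X = 3
    [0.07, 0.04, 0.00, 0.00],   # X = 4
    [0.09, 0.00, 0.00, 0.00],   # X = 5
]

def p(x, y):
    """P(X = x and Y = y), read straight from the table."""
    return joint[x - 1][y - 1]

# (a) All the probabilities sum to one (up to floating-point rounding).
total = sum(sum(row) for row in joint)

# (b)-(d) Individual joint probabilities.
print(p(2, 2))   # 0.04
print(p(4, 1))   # 0.07
print(p(5, 3))   # 0.0

# (e) P(X = 1): sum over every possible value of Y, i.e. the row sum.
p_x1 = sum(joint[0])   # 0.14 (up to rounding)

# Marginal distributions: "sum out" the other variable.
p_x = [sum(row) for row in joint]                       # row sums
p_y = [sum(row[j] for row in joint) for j in range(4)]  # column sums
print(p_x[1])   # P(X = 2) = 0.16 (up to rounding)
print(p_y[0])   # P(Y = 1) = 0.36 (up to rounding)
```

Note that the marginals come out for free once the table is stored this way: a row sum gives a probability for X alone, and a column sum gives one for Y alone, exactly as the marginal-sum row and column in the table do.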

f) What is the distribution of X? Again, this is without regard for the value of Y. The distribution of X is given in the column of marginal sums on the right. Thus P(X=2) = 0.16, P(X=3) = 0.5, etc.

g) What is the distribution of Y? This is given in the row of marginal sums along the bottom. Thus P(Y=1) = 0.36, P(Y=2) = 0.28, etc.

## Marginal Distributions

Given the joint distribution of two random variables, we can determine the individual distribution of each random variable alone. Again, it is important to note that if we want the distribution of X, the value of Y could be anything. To get the distribution of X, we sum over the possible outcomes of Y, since Y can be anything and all we care about is X. When we obtain the distribution of X or Y from a joint distribution by "summing out" the other variable, we call the result a "marginal distribution." The reason for the name is clear, since the distribution is given by the marginal sums in the "margins."

## Conditional Distributions

## This note was uploaded on 02/21/2010 for the course IE 111, taught by Professor Storer during the Spring '07 term at Lehigh University.
