# hw6_sol - ECE 302 Division 2, Homework 6 Solutions

**Problem 1.** Let X and Y be independent random variables. Random variable X has a discrete uniform distribution over the set {1, 2, 3}, and Y has a discrete uniform distribution over the set {1, 3}. Let V = X + Y and W = X − Y.

**(a)** Are V and W independent? Explain without calculations.

**Solution.** No. Independence of V and W would imply that p_{W|V}(w | v) = p_W(w), which means the distribution of W cannot depend on the value of V. But in fact, unconditionally, W can take several different experimental values, whereas conditioned on V = 6, both X and Y must be 3, so W = 0 with probability 1. Thus p_{W|V}(w | 6) ≠ p_W(w), and therefore V and W are dependent.

**(b)** Find and plot p_V(v). Also, determine E[V] and var(V).

**(c)** Find and show in a diagram p_{V,W}(v, w).

**Solution.** To visualize this situation, it is helpful to look at the joint PMF of X and Y. Since both X and Y are uniform, and since they are independent, each of the six possible pairs of their experimental values has probability 1/6. In addition to these probabilities, let us tabulate the corresponding experimental values v and w of random variables V and W:

| y \ x | 1 | 2 | 3 |
|---|---|---|---|
| 1 | 1/6 (v = 2, w = 0) | 1/6 (v = 3, w = 1) | 1/6 (v = 4, w = 2) |
| 3 | 1/6 (v = 4, w = −2) | 1/6 (v = 5, w = −1) | 1/6 (v = 6, w = 0) |

From this table, it is easy to determine the joint probability mass function of V and W. To get the marginal PMF of V, we sum p_{V,W} over w for each v, i.e., take vertical sums in our picture of the joint PMF of V and W:

$$
p_V(v) = \begin{cases} \dfrac{1}{6}, & v = 2, 3, 5, 6, \\[4pt] \dfrac{1}{3}, & v = 4, \\[4pt] 0, & \text{otherwise.} \end{cases}
$$

The PMF of V is depicted in Figure 2. Since it is symmetric about v = 4, we have E[V] = 4.

Figure 1: A sketch of the joint probability mass function of V and W: there are six equally likely pairs of values, each with probability 1/6.
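The enumeration above can be reproduced in a few lines of code. The sketch below (an illustrative check, not part of the original solution) builds the joint PMF of (V, W) from the six equally likely (x, y) pairs, computes the marginal of V, and confirms the dependence argument of part (a): conditioned on V = 6, W is 0 with probability 1, while unconditionally P(W = 0) = 1/3.

```python
from collections import defaultdict
from fractions import Fraction
from itertools import product

# Enumerate the six equally likely (x, y) pairs and tabulate V = X + Y, W = X - Y.
joint = defaultdict(Fraction)
for x, y in product([1, 2, 3], [1, 3]):
    joint[(x + y, x - y)] += Fraction(1, 6)

# Marginal PMFs: sum the joint PMF over the other variable.
p_V = defaultdict(Fraction)
p_W = defaultdict(Fraction)
for (v, w), p in joint.items():
    p_V[v] += p
    p_W[w] += p

# Conditional PMF of W given V = 6: only (x, y) = (3, 3) gives V = 6.
p_W_given_V6 = {w: p / p_V[6] for (v, w), p in joint.items() if v == 6}

print(dict(p_V))      # p_V(4) = 1/3, p_V(v) = 1/6 for v = 2, 3, 5, 6
print(p_W_given_V6)   # W = 0 with probability 1 given V = 6
print(p_W[0])         # unconditionally, P(W = 0) = 1/3
```

Since p_{W|V}(0 | 6) = 1 differs from p_W(0) = 1/3, the code agrees with the answer to part (a): V and W are dependent.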
The two circled points in Figure 1 comprise the event W > 0 considered in part (d).

Figure 2: p_V(v).

Also,

$$
\operatorname{var}(V) = E\!\left[(V - E[V])^2\right] = E\!\left[(V - 4)^2\right] = \sum_v (v - 4)^2\, p_V(v) = \frac{1}{6}\left[(2-4)^2 + (3-4)^2 + (5-4)^2 + (6-4)^2\right] + \frac{1}{3}(4-4)^2 = \frac{1}{6}(4 + 1 + 1 + 4) = \frac{5}{3}.
$$

These five conditional probability mass functions are depicted in Figure 3.

**(d)** Find E[V | W > 0].

**Solution.** The event W > 0 corresponds to the two circled points in the picture of the joint PMF of V and W. Conditioned on this event, there are two equally likely values that V can assume: 3 and 4. The conditional expectation is therefore 3.5.
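The numbers E[V] = 4, var(V) = 5/3, and E[V | W > 0] = 3.5 can all be checked by averaging over the six equally likely outcomes directly. This is an illustrative verification sketch, not part of the original solution:

```python
from fractions import Fraction
from itertools import product

# The six equally likely (v, w) outcomes, each with probability 1/6.
outcomes = [(x + y, x - y) for x, y in product([1, 2, 3], [1, 3])]

# E[V] and var(V) as uniform averages over the six outcomes.
E_V = sum(Fraction(v) for v, _ in outcomes) / 6
var_V = sum((Fraction(v) - E_V) ** 2 for v, _ in outcomes) / 6

# E[V | W > 0]: the outcomes with w > 0 remain equally likely after conditioning.
v_given_w_pos = [v for v, w in outcomes if w > 0]
E_V_given_W_pos = Fraction(sum(v_given_w_pos), len(v_given_w_pos))

print(E_V)              # 4
print(var_V)            # 5/3
print(E_V_given_W_pos)  # 7/2, i.e. 3.5
```

The conditioning step works because the prior outcomes are equally likely, so restricting to the event W > 0 leaves a uniform distribution over the two surviving points (v = 3 and v = 4).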