Z and T.

3. (20 Points) Let X and Y be two jointly continuous random variables. Show that E[X + Y] = E[X] + E[Y] whether or not X and Y are independent.

4. (20 Points) Let X and Y be two independent Gaussian random variables with parameters (μ_X, σ_X^2) and (μ_Y, σ_Y^2), respectively. Find the PDF of Z = X + Y.

5. (20 Points) Let X and Y be two independent random variables, both uniformly distributed over (0, 1). Use the convolution formula to calculate the PDF of Z = X + Y.
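A quick numerical illustration of Problem 3: linearity of expectation holds even when X and Y are dependent. The sketch below constructs deliberately dependent variables (Y is built from X; the specific construction is an arbitrary choice for illustration) and checks that the sample mean of X + Y matches the sum of the sample means.

```python
import random

random.seed(0)
n = 100_000

# X ~ Uniform(0, 1); Y depends on X (Y = X^2 plus independent noise),
# so X and Y are NOT independent.
xs = [random.random() for _ in range(n)]
ys = [x * x + random.random() for x in xs]

def mean(v):
    return sum(v) / len(v)

lhs = mean([x + y for x, y in zip(xs, ys)])  # sample estimate of E[X + Y]
rhs = mean(xs) + mean(ys)                    # sample E[X] + sample E[Y]

# The two agree up to floating-point rounding, independence never used.
print(abs(lhs - rhs))
```

Note this only demonstrates the identity on sample averages; the problem asks for a proof via the joint density, where the integral splits by linearity of integration.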
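For Problem 4, the known result is that Z = X + Y is again Gaussian with mean μ_X + μ_Y and variance σ_X^2 + σ_Y^2. A Monte Carlo sanity check of that claim, with arbitrary hypothetical parameter values chosen for illustration:

```python
import random

random.seed(1)
n = 200_000

# Hypothetical parameters for the two independent Gaussians.
mu_x, sigma_x = 1.0, 2.0
mu_y, sigma_y = -0.5, 1.5

# Independent draws, summed.
zs = [random.gauss(mu_x, sigma_x) + random.gauss(mu_y, sigma_y)
      for _ in range(n)]

m = sum(zs) / n                          # sample mean of Z
v = sum((z - m) ** 2 for z in zs) / n    # sample variance of Z

print(m, mu_x + mu_y)              # should be close to 0.5
print(v, sigma_x**2 + sigma_y**2)  # should be close to 6.25
```

The sampling only checks the first two moments; the full derivation (e.g. by convolving the two Gaussian densities and completing the square) shows the resulting PDF is exactly Gaussian.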
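For Problem 5, the convolution f_Z(z) = ∫ f_X(x) f_Y(z - x) dx of two Uniform(0, 1) densities gives the triangular density f_Z(z) = z on (0, 1] and f_Z(z) = 2 - z on (1, 2). The sketch below evaluates the convolution integral numerically (midpoint rule over the support of f_X) and compares it against that closed form at a few points:

```python
def f_uniform(t):
    """Density of Uniform(0, 1)."""
    return 1.0 if 0.0 <= t <= 1.0 else 0.0

def f_z_numeric(z, steps=10_000):
    """Midpoint-rule approximation of (f_X * f_Y)(z) over x in (0, 1)."""
    dx = 1.0 / steps
    return sum(f_uniform((i + 0.5) * dx) * f_uniform(z - (i + 0.5) * dx)
               for i in range(steps)) * dx

def f_z_exact(z):
    """Triangular density on (0, 2): the claimed convolution result."""
    if 0.0 <= z <= 1.0:
        return z
    if 1.0 < z <= 2.0:
        return 2.0 - z
    return 0.0

for z in (0.25, 0.5, 1.0, 1.5, 1.75):
    print(z, f_z_numeric(z), f_z_exact(z))
```

Geometrically, the integrand is 1 exactly where both 0 < x < 1 and 0 < z - x < 1 hold, so f_Z(z) is the length of the interval (max(0, z - 1), min(1, z)), which is what the piecewise formula expresses.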
Spring '07, Chamberlain