


σ_Z² = σ_X² + 4 Cov(X, Y) + 4σ_Y² = 5 − 4 + 8 = 9. Thus,

P{X + 2Y ≥ 1} = P{Z ≥ 1} = P{Z/3 ≥ 1/3} = Q(1/3) = 1 − Φ(1/3) ≈ 0.3694.

Example 4.11.4 Let X and Y be jointly Gaussian random variables with mean zero, variance one, and Cov(X, Y) = ρ. Find E[Y²|X], the best estimator of Y² given X. (Hint: X and Y² are not jointly Gaussian. But you know the conditional distribution of Y given X = u and can use it to find the conditional second moment of Y given X = u.)

Solution: Recall that E[Z²] = E[Z]² + Var(Z) for any random variable Z. The idea is to apply this fact to the conditional distribution of Y given X. Given X = u, the conditional distribution of Y is Gaussian with mean ρu and variance 1 − ρ². Thus, E[Y²|X = u] = (ρu)² + 1 − ρ². Therefore, E[Y²|X] = (ρX)² + 1 − ρ².

Example 4.11.5 Suppose X and Y are zero-mean, unit-variance jointly Gaussian random variables with correlation coefficient ρ = 0.5. (a) Find Var(3X − 2Y). (b) Find th...
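The two computations above can be sanity-checked numerically. The sketch below (variable names such as rho, u, and n are ad hoc choices, not from the text) evaluates Q(1/3) via the complementary error function, and Monte Carlo-estimates E[Y²|X = u] by sampling from the conditional law Y | X = u, which is N(ρu, 1 − ρ²):

```python
import math
import random

# (1) P{X + 2Y >= 1} = Q(1/3) = 1 - Phi(1/3), using Q(z) = erfc(z / sqrt(2)) / 2.
q = 0.5 * math.erfc((1.0 / 3.0) / math.sqrt(2.0))  # approx 0.3694

# (2) Monte Carlo check of E[Y^2 | X = u] = (rho*u)^2 + 1 - rho^2.
# Given X = u, Y is Gaussian with mean rho*u and variance 1 - rho^2,
# so sample Y directly from that conditional distribution.
rho, u = 0.5, 1.3          # illustrative values, not from the text
random.seed(0)
n = 200_000
sd = math.sqrt(1.0 - rho**2)
mc = sum((rho * u + sd * random.gauss(0.0, 1.0)) ** 2 for _ in range(n)) / n

exact = (rho * u) ** 2 + 1.0 - rho ** 2
print(q, mc, exact)
```

With 200,000 samples the standard error of the Monte Carlo mean is well under 0.01, so the estimate should land close to the closed-form value (ρu)² + 1 − ρ².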

This note was uploaded on 02/09/2014 for the course ISYE 2027 taught by Professor Zahrn during the Spring '08 term at Georgia Institute of Technology.
