ISYE 2027




... = E[XY], so

$$\rho_{X,Y} = E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} uv\, f_X(u)\, f_{Y|X}(v|u)\, dv\, du = \int_{-\infty}^{\infty} u f_X(u) \left( \int_{-\infty}^{\infty} v\, f_{Y|X}(v|u)\, dv \right) du = \rho \int_{-\infty}^{\infty} u^2 f_X(u)\, du = \rho.$$

This proves (c). If $\rho \neq 0$, then $\rho_{X,Y} \neq 0$, so that $X$ and $Y$ are not independent. If $\rho = 0$, then $f_{X,Y}$ factors into the product of two single-variable normal pdfs, so $X$ and $Y$ are independent. This proves (d). By (4.33), $L^*(u) = \rho u$. Therefore, $L^*(u) = g^*(u)$, as claimed, proving (e). By (4.34), the MSE for using $L^*$ is $\sigma_e^2 = 1 - \rho^2$, so (f) follows from (4.39).

Example 4.11.3. Let $X$ and $Y$ be jointly Gaussian random variables with mean zero, $\sigma_X^2 = 5$, $\sigma_Y^2 = 2$, and $\mathrm{Cov}(X,Y) = -1$. Find $P\{X + 2Y \geq 1\}$.

Solution: Let $Z = X + 2Y$. Then $Z$ is a linear combination of jointly Gaussian random variables, so $Z$ itself is a Gaussian random variable. Also, $E[Z] = E[X] + 2E[Y] = 0$, and

$$\sigma_Z^2 = \mathrm{Cov}(X + 2Y, X + 2Y) = \mathrm{Cov}(X,X) + \mathrm{Cov}(X,2Y) + \mathrm{Cov}(2Y,X) + \mathrm{Cov}(2Y,2Y) = 5 - 2 - 2 + 8 = 9.$$

Therefore $Z$ is Gaussian with mean $0$ and standard deviation $3$, so $P\{X + 2Y \geq 1\} = P\{Z/3 \geq 1/3\} = Q(1/3)$.
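The computation in Example 4.11.3 can be checked numerically. The sketch below (not part of the text; it assumes numpy and scipy are available) computes the variance of $Z = X + 2Y$ as a quadratic form in the covariance matrix, evaluates the exact tail probability $Q(1/3)$, and compares against a Monte Carlo estimate:

```python
import numpy as np
from scipy.stats import norm

# Parameters from Example 4.11.3: zero-mean jointly Gaussian X, Y with
# Var(X) = 5, Var(Y) = 2, Cov(X, Y) = -1.
cov = np.array([[5.0, -1.0],
                [-1.0, 2.0]])

# Z = X + 2Y is a^T (X, Y) with a = (1, 2); its variance is a^T C a.
a = np.array([1.0, 2.0])
var_z = a @ cov @ a          # 5 + 2*(-1) + 2*(-1) + 4*2 = 9
sigma_z = np.sqrt(var_z)     # 3

# Exact answer: P{Z >= 1} = Q(1/3), the standard normal survival function.
exact = norm.sf(1.0 / sigma_z)

# Monte Carlo check: sample (X, Y) jointly and estimate P{X + 2Y >= 1}.
rng = np.random.default_rng(0)
samples = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)
mc = np.mean(samples @ a >= 1.0)

print(var_z, exact, mc)
```

With 200,000 samples the Monte Carlo estimate should agree with $Q(1/3) \approx 0.369$ to within about $\pm 0.003$.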

This note was uploaded on 02/09/2014 for the course ISYE 2027 taught by Professor Zahrn during the Spring '08 term at Georgia Tech.
