
Example 4.10.1 Let $X = Y + N$, where $Y$ has the exponential distribution with parameter $\lambda$, and $N$ is Gaussian with mean $0$ and variance $\sigma^2$. Suppose the variables $Y$ and $N$ are independent, and the parameters $\lambda$ and $\sigma^2$ are known and strictly positive. (Recall that $E[Y] = 1/\lambda$ and $\mathrm{Var}(Y) = 1/\lambda^2$.)

(a) Find $L^*$, the minimum MSE linear estimator of $Y$ given $X$, and also find the resulting MSE.

(b) Find an unconstrained estimator of $Y$ yielding a strictly smaller MSE than $L^*$ does.

Solution: (a) Since $Y$ and $N$ are independent, $\mathrm{Cov}(Y, N) = 0$. Therefore,
$$\mathrm{Cov}(Y, X) = \mathrm{Cov}(Y, Y + N) = \mathrm{Cov}(Y, Y) + \mathrm{Cov}(Y, N) = \mathrm{Var}(Y),$$
$$\mathrm{Var}(X) = \mathrm{Var}(Y + N) = \mathrm{Var}(Y) + \mathrm{Var}(N).$$
So, by (4.32),
$$L^*(X) = \frac{1}{\lambda} + \frac{1/\lambda^2}{1/\lambda^2 + \sigma^2}\left(X - \frac{1}{\lambda}\right) = \frac{1}{\lambda} + \frac{1}{1 + \lambda^2\sigma^2}\left(X - \frac{1}{\lambda}\right) = \frac{X + \lambda\sigma^2}{1 + \lambda^2\sigma^2},$$
and by (4.34),
$$\text{MSE for } L^*(X) = \frac{1}{\lambda^2} - \frac{1/\lambda^4}{1/\lambda^2 + \sigma^2} = \frac{\sigma^2}{1 + \lambda^2\sigma^2}.$$

(b) Although $Y$ is always nonnegative, the estimator $L^*(X)$ can be negative. An estimator with smaller MSE is $\widehat{Y} = \max\{0, L^*(X)\}$, because $(Y - \widehat{Y})^2 \le (Y - L^*(X))^2$ always holds, with strict inequality on the event $\{L^*(X) < 0\}$, which has positive probability since $N$ is Gaussian. Hence the MSE of $\widehat{Y}$ is strictly smaller than that of $L^*(X)$.
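The calculation can be checked numerically. Below is a minimal Monte Carlo sketch in Python with NumPy; the parameter values lam = 1.0 and sigma = 2.0 are illustrative assumptions, not values taken from the example. It simulates Y and N, applies the linear estimator L*(X) = (X + λσ²)/(1 + λ²σ²), compares the empirical MSE with the formula σ²/(1 + λ²σ²), and checks that clipping at zero gives a smaller MSE.

```python
import numpy as np

# Monte Carlo check of Example 4.10.1.
# lam and sigma are illustrative choices, not values from the text.
rng = np.random.default_rng(0)
lam, sigma = 1.0, 2.0
n = 1_000_000

Y = rng.exponential(scale=1.0 / lam, size=n)  # E[Y] = 1/lam, Var(Y) = 1/lam^2
N = rng.normal(loc=0.0, scale=sigma, size=n)  # mean 0, variance sigma^2
X = Y + N

# Part (a): linear MMSE estimator L*(X) = (X + lam*sigma^2) / (1 + lam^2*sigma^2)
L_star = (X + lam * sigma**2) / (1.0 + lam**2 * sigma**2)
mse_linear = np.mean((Y - L_star) ** 2)
mse_theory = sigma**2 / (1.0 + lam**2 * sigma**2)

# Part (b): clipping at zero can only move the estimate closer to Y >= 0
Y_hat = np.maximum(0.0, L_star)
mse_clipped = np.mean((Y - Y_hat) ** 2)

print(f"empirical MSE of L*                    : {mse_linear:.4f}")
print(f"theoretical sigma^2/(1 + lam^2 sigma^2): {mse_theory:.4f}")
print(f"empirical MSE of max(0, L*)            : {mse_clipped:.4f}")
```

With these assumed parameters the theoretical MSE is 4/5 = 0.8; the clipped estimator's empirical MSE should come out strictly below the linear estimator's, as argued in part (b).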