...no constraints are placed on the function g.
Suppose you observe X = 10. What do you know about Y? If you know the joint pdf of X and Y, you also know, or can derive, the conditional pdf of Y given X = 10, denoted by f_{Y|X}(v|10). Based on the fact, discussed above, that the minimum MSE constant estimator of a random variable is its mean, it makes sense to estimate Y by the conditional mean:

    E[Y | X = 10] = \int_{-\infty}^{\infty} v f_{Y|X}(v|10) \, dv.

The resulting conditional MSE is the variance of Y, computed using the conditional distribution of Y given X = 10:
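As a numerical illustration (a hypothetical model, not one from the text), suppose X and Y are jointly standard normal with correlation rho = 0.5, so that Y given X = x is N(rho*x, 1 - rho^2). The conditional-mean estimate given X = 10 should then be rho * 10 = 5, and a direct numerical evaluation of the integral above confirms it:

```python
import numpy as np

# Assumed model (illustrative only): X, Y jointly standard normal,
# correlation rho = 0.5.  Then Y | X = x ~ N(rho*x, 1 - rho**2).
rho = 0.5
x_obs = 10.0
cond_mean = rho * x_obs          # E[Y | X = 10] in this model
cond_var = 1.0 - rho**2

# Conditional pdf f_{Y|X}(v | 10) on a grid wide enough to capture the mass
v = np.linspace(cond_mean - 10, cond_mean + 10, 200001)
f = np.exp(-(v - cond_mean)**2 / (2 * cond_var)) / np.sqrt(2 * np.pi * cond_var)

# Numerically evaluate E[Y | X = 10] = integral of v * f_{Y|X}(v|10) dv
est = np.trapz(v * f, v)
print(round(est, 4))  # ≈ 5.0
```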
    E[(Y − E[Y | X = 10])^2 | X = 10] = E[Y^2 | X = 10] − (E[Y | X = 10])^2.

Conditional expectation indeed gives the optimal estimator, as we show now. Recall that
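The variance identity above can be checked numerically. Continuing the hypothetical Gaussian example, take the conditional distribution of Y given X = 10 to be N(5, 0.75) and compare the two sides:

```python
import numpy as np

# Assumed conditional distribution (illustrative): Y | X = 10 ~ N(5, 0.75).
# Verify E[(Y - E[Y|X=10])^2 | X=10] = E[Y^2 | X=10] - (E[Y|X=10])^2.
m, s2 = 5.0, 0.75
v = np.linspace(m - 10, m + 10, 200001)
f = np.exp(-(v - m)**2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)

mean = np.trapz(v * f, v)              # E[Y | X = 10]
second = np.trapz(v**2 * f, v)         # E[Y^2 | X = 10]
lhs = np.trapz((v - mean)**2 * f, v)   # conditional MSE
rhs = second - mean**2
print(round(lhs, 6), round(rhs, 6))  # both ≈ 0.75
```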
f_{X,Y}(u, v) = f_X(u) f_{Y|X}(v|u). So
    MSE = E[(Y − g(X))^2]
        = \int_{-\infty}^{\infty} \left( \int_{-\infty}^{\infty} (v − g(u))^2 f_{Y|X}(v|u) \, dv \right) f_X(u) \, du.    (4.26)

For each fixed u, the integral in parentheses in (4.26) has the same form as the integral (4.25). Therefore, for each u, the integral in parentheses in (4.26) is minimized by...
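Assuming the standard conclusion of this argument, that the inner integral is minimized by choosing g(u) = E[Y | X = u], the optimality can be checked by simulation. A hypothetical sketch with a bivariate normal pair, for which E[Y | X = u] = rho*u, comparing the optimal estimator against a deliberately perturbed one:

```python
import numpy as np

# Assumed model (illustrative): X, Y jointly standard normal, correlation 0.5,
# so the conditional-mean estimator is g(u) = rho * u.
rng = np.random.default_rng(0)
rho = 0.5
n = 200_000
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

mse_opt = np.mean((y - rho * x)**2)           # g(X) = E[Y | X]
mse_alt = np.mean((y - (rho * x + 0.3))**2)   # a perturbed estimator
print(mse_opt < mse_alt)  # True: the conditional mean achieves smaller MSE
```

The optimal MSE here is close to 1 − rho^2 = 0.75, the conditional variance, matching the earlier discussion.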
This note was uploaded on 02/09/2014 for the course ISYE 2027 taught by Professor Zahrn during the Spring '08 term at Georgia Tech.