percent about 365·(0.5) = 182.5 times, and to decrease by one percent about 365·(0.1) = 36.5 times. That leads to a year-end value of (1.01)^{182.5} (0.99)^{36.5} ≈ 4.26.)

4.10 Minimum mean square error estimation

4.10.1 Constant estimators

Let Y be a random variable with some known distribution. Suppose Y is not observed but that
we wish to estimate Y . If we use a constant δ to estimate Y , the estimation error will be Y − δ .
The mean square error (MSE) for estimating Y by δ is deﬁned by E [(Y − δ )2 ]. By LOTUS, if Y is
a continuous-type random variable,
MSE (for estimation of Y by a constant δ) = ∫_{−∞}^{∞} (y − δ)^2 f_Y(y) dy.   (4.25)

We seek to find δ to minimize the MSE. Since (Y − δ)^2 = Y^2 − 2δY + δ^2, we can use linearity of
expectation to get E[(Y − δ)^2] = E[Y^2] − 2δE[Y] + δ^2. This is quadratic in δ, and the derivative with respect to δ is −2E[Y] + 2δ. Setting the derivative to zero shows the minimum occurs at δ* = E[Y]. For this value of δ, the MSE is E[(Y − δ*)^2] = Var(Y). In summary, the constant that minimizes the MSE for estimating Y is the mean E[Y], and the resulting minimum MSE is Var(Y).
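This can be checked numerically. The sketch below (an illustrative assumption, not from the text: Y is taken to be exponential with mean 2, so E[Y] = 2 and Var(Y) = 4) scans candidate constants δ and confirms that the empirical MSE is smallest near the sample mean, with minimum value near the sample variance.

```python
import numpy as np

# Hypothetical example distribution (not from the text): Y ~ Exponential
# with mean 2, so E[Y] = 2 and Var(Y) = 4.
rng = np.random.default_rng(0)
y = rng.exponential(scale=2.0, size=200_000)

def mse(delta):
    """Empirical mean square error E[(Y - delta)^2] for a constant estimator."""
    return np.mean((y - delta) ** 2)

# Scan a grid of candidate constants delta. The minimizer should land near
# the sample mean, and the minimum MSE near the sample variance.
deltas = np.linspace(0.0, 4.0, 401)
errors = np.array([mse(d) for d in deltas])
best = deltas[np.argmin(errors)]

print(best)          # should be near E[Y] = 2
print(errors.min())  # should be near Var(Y) = 4
```

The decomposition behind the result is visible here too: E[(Y − δ)^2] = Var(Y) + (E[Y] − δ)^2, so the MSE curve is the variance plus a quadratic penalty for how far δ sits from the mean.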
This note was uploaded on 02/09/2014 for the course ISYE 2027 taught by Professor Zahrn during the Spring '08 term at Georgia Institute of Technology.