Mean Square Estimation

• Given $X_1, X_2, \ldots, X_n$ as a sequence of observed random variables,
• $Y$ represents an unknown random variable to be estimated in terms of the observations $X_1, X_2, \ldots, X_n$:
  $\hat{Y} = \varphi(X), \qquad X = (X_1, X_2, \ldots, X_n).$
• Note that $\varphi(\cdot)$ can be a linear or a nonlinear function of the observations.
• $\varepsilon = Y - \hat{Y} = Y - \varphi(X)$ represents the error in the above estimate, and $|\varepsilon|^2$ is the quadratic error.
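The setup above can be sketched numerically. This is a minimal simulation under an assumed toy model ($Y = 2X + \text{noise}$, not from the slides), comparing the empirical quadratic error $|\varepsilon|^2$ for a linear and a nonlinear choice of $\varphi$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy model (not from the slides): a single observation X,
# with Y = 2*X + W, W ~ N(0, 0.5^2).
n = 100_000
X = rng.normal(0.0, 1.0, n)
Y = 2.0 * X + rng.normal(0.0, 0.5, n)

# Two candidate estimators phi(X): one linear, one nonlinear.
Y_hat_linear = 2.0 * X          # phi(x) = 2x, matched to the model
Y_hat_nonlin = np.tanh(X)       # phi(x) = tanh(x), a poor choice here

# Error eps = Y - Y_hat and its mean quadratic error E{|eps|^2}.
eps_linear = Y - Y_hat_linear
eps_nonlin = Y - Y_hat_nonlin

print(np.mean(eps_linear**2))   # close to the noise variance 0.25
print(np.mean(eps_nonlin**2))   # much larger
```

Both linear and nonlinear $\varphi$ fit the framework; the question the following slides answer is which $\varphi$ makes the mean quadratic error smallest.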
• $E\{|\varepsilon|^2\} = E\{|Y - \varphi(X)|^2\}$ represents the mean square error.
• One strategy to obtain a good estimator would be to minimize the mean square error by varying $\varphi(\cdot)$.
• This procedure gives rise to the Minimization of the Mean Square Error (MMSE) criterion for estimation.

Theorem 1: Under the MMSE criterion, the best estimator for the unknown $Y$ in terms of $X_1, X_2, \ldots, X_n$ is given by the conditional mean of $Y$ given $X$:
  $\hat{Y} = \varphi(X) = E\{Y \mid X\}.$

Proof: Let $\hat{Y} = \varphi(X)$ represent an estimate of $Y$ in terms of $X = (X_1, X_2, \ldots, X_n)$. The error $\varepsilon = Y - \hat{Y}$ gives the mean square error
  $\sigma_{\varepsilon}^2 = E\{|\varepsilon|^2\} = E\{|Y - \hat{Y}|^2\} = E\{|Y - \varphi(X)|^2\}.$
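Theorem 1 can be checked by simulation. The sketch below assumes a toy nonlinear model $Y = X^2 + W$ (not from the slides), for which the conditional mean is $E\{Y \mid X\} = X^2$; its empirical mean square error is compared against the best *linear* estimator fitted by least squares:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy model: X ~ N(0,1), Y = X^2 + W, W ~ N(0, 0.1).
# Then E{Y | X} = X^2, which Theorem 1 identifies as the MMSE estimator.
n = 200_000
X = rng.normal(0.0, 1.0, n)
Y = X**2 + rng.normal(0.0, np.sqrt(0.1), n)

# MSE of the conditional-mean estimator phi(X) = X^2.
mse_cond_mean = np.mean((Y - X**2) ** 2)

# Best linear estimator a*X + b, fitted by least squares, for comparison.
a, b = np.polyfit(X, Y, 1)
mse_linear = np.mean((Y - (a * X + b)) ** 2)

print(mse_cond_mean)   # ~0.1, the noise variance
print(mse_linear)      # ~2.1: X and X^2 are uncorrelated here, so no
                       # linear function of X can track the parabola
```

The conditional mean achieves the noise floor, while even the optimal linear estimator cannot do better than predicting the constant $E\{Y\}$ in this model.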
• Using the iterated-expectation identity $E[z] = E[\,E\{z \mid X\}\,]$, we can rewrite
  $\sigma_{\varepsilon}^2 = E\{|Y - \varphi(X)|^2\} = E[\,E\{|Y - \varphi(X)|^2 \mid X\}\,].$
  Note: the inner expectation is with respect to $Y$, and the outer one is with respect to $X$.
• Thus
  $\sigma_{\varepsilon}^2 = \int_{-\infty}^{+\infty} E\{|Y - \varphi(X)|^2 \mid X\}\, f_X(x)\, dx.$
• To obtain the best estimator, we need to minimize $\sigma_{\varepsilon}^2$ with respect to $\varphi$.
• Since $f_X(x) \ge 0$ and $E\{|Y - \varphi(X)|^2 \mid X\} \ge 0$, and the variable $\varphi$ appears only in the integrand, minimizing $\sigma_{\varepsilon}^2$ w.r.t. $\varphi$ is equivalent to minimizing $E\{|Y - \varphi(X)|^2 \mid X\}$ w.r.t. $\varphi$.
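The iterated-expectation identity $E[z] = E[\,E\{z \mid X\}\,]$ used above can be verified numerically. A minimal sketch with an assumed two-state $X$ (the mixture parameters are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed discrete setup: X in {0, 1} with P(X=1) = 0.3;
# given X, Y ~ N(mu_X, 1) with mu_0 = -1 and mu_1 = 4.
n = 500_000
X = (rng.random(n) < 0.3).astype(int)
mu = np.where(X == 1, 4.0, -1.0)
Y = mu + rng.normal(0.0, 1.0, n)

# Direct expectation E[Y].
lhs = Y.mean()

# Iterated form E[E{Y | X}]: inner conditional means, then the outer
# expectation over X's distribution.
inner = np.array([Y[X == 0].mean(), Y[X == 1].mean()])
p = np.array([np.mean(X == 0), np.mean(X == 1)])
rhs = np.dot(p, inner)

print(lhs, rhs)   # both close to 0.7*(-1) + 0.3*4 = 0.5
```

With empirical group means and group frequencies the two sides agree exactly (up to floating point), mirroring the identity: the inner average is over $Y$ within each value of $X$, the outer average is over $X$.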
• $X$ being fixed at some value $x$, $\varphi(X)$ is no longer random, and hence minimization of $E\{|Y - \varphi(X)|^2 \mid X\}$ is equivalent to setting
  $\frac{\partial}{\partial \varphi}\, E\{|Y - \varphi(X)|^2 \mid X\} = 0.$
• This gives
  $E\{(Y - \varphi(X)) \mid X\} = 0,$
  or
  $E\{Y \mid X\} - E\{\varphi(X) \mid X\} = 0.$
• But $E\{\varphi(X) \mid X\} = \varphi(X)$, since when $X = x$ is fixed, $\varphi(x)$ is a fixed number. Hence the MMSE estimator is $\varphi(X) = E\{Y \mid X\}$.
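The pointwise minimization can be illustrated numerically: with $X$ fixed near a value $x_0$, $\varphi(x_0)$ is just a number $c$, and sweeping $c$ shows the conditional mean square error is smallest at $c \approx E\{Y \mid X = x_0\}$. The model below is an assumed toy example, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed model: X ~ N(0,1), Y = X^2 + W, W ~ N(0, 0.3^2).
# Condition on X near x0 = 1, so E{Y | X = x0} is approximately 1.
n = 1_000_000
X = rng.normal(0.0, 1.0, n)
Y = X**2 + rng.normal(0.0, 0.3, n)

x0 = 1.0
mask = np.abs(X - x0) < 0.02          # approximate the conditioning event
Yc = Y[mask]

# For fixed X = x0, phi(x0) is a number c; sweep c and minimize
# the empirical conditional MSE E{|Y - c|^2 | X = x0}.
cs = np.linspace(0.0, 2.0, 201)
cond_mse = [np.mean((Yc - c) ** 2) for c in cs]
c_best = cs[int(np.argmin(cond_mse))]

print(c_best)          # close to E{Y | X = 1} = 1
print(Yc.mean())       # the empirical conditional mean, also close to 1
```

The minimizing constant coincides (up to grid and sampling error) with the empirical conditional mean, which is exactly the conclusion of the derivative argument above.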