1
Statistics and Estimation
2
Principles of Parameter Estimation
• Apply the earlier concepts in practical problems
• Consider the problem of estimating an unknown parameter of interest from noisy observations of a related process
• For example, determining the daily temperature in a city, the depth of the ocean, or the location of a target
• Observations (measurements) of the data include the nonrandom parameter θ of interest and undesired noise:
  Observation = signal (desired part) + noise
3
Observations and Estimation
• The i-th observation may be represented as
  $X_i = \theta + n_i, \quad i = 1, 2, \ldots, n$
• Here θ represents an unknown nonrandom desired parameter
• The estimated parameter is designated $\hat{\theta}(X_i,\ i = 1, 2, \ldots, n)$
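The observation model can be simulated directly. This sketch (not from the slides) assumes i.i.d. Gaussian noise and uses the sample mean as one simple candidate estimator θ̂(X):

```python
import random
import statistics

def observe(theta, n, noise_sd=1.0, seed=0):
    """Generate n noisy observations X_i = theta + n_i (Gaussian noise assumed)."""
    rng = random.Random(seed)
    return [theta + rng.gauss(0.0, noise_sd) for _ in range(n)]

def estimate(x):
    """A simple candidate estimator theta_hat(X): the sample mean."""
    return statistics.mean(x)

theta = 5.0                     # true (unknown) parameter
x = observe(theta, n=1000)      # noisy observations
theta_hat = estimate(x)         # estimate computed from the data
print(abs(theta_hat - theta))   # error magnitude; small for large n
```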
4
Problem Statement
• $X_i,\ i = 1, \ldots, n$ represent random variables which may or may not be dependent
• Given the observations, the estimation problem is to obtain the "best" estimate of the unknown parameter θ
• Denote by $\hat{\theta}(X)$ the estimator for θ, which is a function of the observations $X_i,\ i = 1, \ldots, n$.
• "Best estimator" in what sense? Various optimization strategies yield different notions of "best".
5
Solution to Problem
• The ideal solution would be when $\hat{\theta}(X)$ coincides with the unknown θ.
• This of course may not be possible, and almost always any estimate will result in an error given by
  $e = \hat{\theta}(X) - \theta$
• One strategy would be to select an estimator so as to minimize some function of this error – the minimal mean square error (MMSE), or the minimal absolute error.
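To make the error criterion concrete, this illustrative sketch (assumptions: Gaussian noise and a hypothetical `mse` helper, neither from the slides) estimates the mean square error of two candidate estimators empirically:

```python
import random
import statistics

def mse(estimator, theta=2.0, n=25, trials=2000, noise_sd=1.0, seed=1):
    """Empirical mean square error E[(theta_hat - theta)^2] of an estimator."""
    rng = random.Random(seed)
    errs = []
    for _ in range(trials):
        x = [theta + rng.gauss(0.0, noise_sd) for _ in range(n)]
        errs.append((estimator(x) - theta) ** 2)
    return statistics.mean(errs)

mse_mean  = mse(statistics.mean)   # uses all n observations
mse_first = mse(lambda x: x[0])    # ignores all but one observation
print(mse_mean < mse_first)        # True: averaging shrinks the MS error
```

The sample mean's MSE is roughly noise_sd²/n here, versus noise_sd² for a single observation, which is why minimizing MSE favors estimators that pool the data.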
6
MS Estimation
• If we choose $g(x) = \varphi(x)$ to minimize the MS error,
  $\overline{e^2} = E\{[\theta - \varphi(x)]^2\} = \int [\theta - \varphi(x)]^2\, f(x, \theta)\, dx$
• The best estimator is $\hat{\theta} = \varphi(x)$
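For completeness, the standard result (not spelled out on this slide, and stated under the assumption that θ is treated as random with joint pdf $f(x,\theta)$) is that the minimizing function is the conditional mean:

```latex
\varphi(x) \;=\; E\{\theta \mid x\} \;=\; \int \theta \, f_{\theta \mid x}(\theta \mid x)\, d\theta ,
```

so the MS-best estimator is $\hat{\theta} = E\{\theta \mid X\}$.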
7
Estimation Solution
• Another systematic approach is the principle of Maximum Likelihood (ML)
• Given $X_1 = x_1,\ X_2 = x_2,\ \ldots,\ X_n = x_n$, the joint pdf $f_X(x_1, x_2, \ldots, x_n; \theta)$ is the likelihood function
• The method of maximum likelihood assumes the given sample data set is representative of the population $f_X(x_1, x_2, \ldots, x_n; \theta)$
• With the observations $x_1, x_2, \ldots, x_n$ fixed, $f_X(x_1, x_2, \ldots, x_n; \theta)$ is a function of θ alone
• The value of θ maximizing the above pdf is the most likely value for θ, and is chosen as the ML estimate
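To see the likelihood as a function of θ alone once the data are fixed, this sketch assumes an i.i.d. Gaussian model $X_i \sim N(\theta, 1)$ (a model choice not specified on the slides) and scans θ over a grid:

```python
import math
import random

# Assumed model for illustration: X_i ~ N(theta, 1), i.i.d.
rng = random.Random(0)
theta_true = 3.0
x = [theta_true + rng.gauss(0.0, 1.0) for _ in range(50)]  # fixed observations

def likelihood(theta, data):
    """Joint pdf f_X(x_1, ..., x_n; theta), viewed as a function of theta."""
    return math.prod(
        math.exp(-(xi - theta) ** 2 / 2) / math.sqrt(2 * math.pi) for xi in data
    )

# With the data fixed, scan theta: the likelihood peaks near theta_true.
grid = [t / 10 for t in range(0, 61)]      # theta in [0, 6], step 0.1
best = max(grid, key=lambda t: likelihood(t, x))
print(best)                                # close to theta_true
```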
8
ML Estimate
• Choose the value for θ that most likely caused the observed data to occur, i.e.,
  $f_X(x_1, x_2, \ldots, x_n;\ \hat{\theta}_{ML}(X)) = \sup_{\theta} f_X(x_1, x_2, \ldots, x_n; \theta)$
• The ML estimate can be determined either from the likelihood equation
  $\partial f_X(x_1, x_2, \ldots, x_n; \theta)/\partial\theta = 0$
9
ML Estimate
• Or by maximizing the log-likelihood:
  $\partial \log f_X(x_1, x_2, \ldots, x_n; \theta)/\partial\theta = 0$
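For the assumed i.i.d. Gaussian model $X_i \sim N(\theta, \sigma^2)$ (an example, not prescribed by the slides), the log-likelihood equation has a closed-form root, which this sketch verifies:

```python
import random
import statistics

# For X_i ~ N(theta, sigma^2), the log-likelihood is
#   log L(theta) = -(n/2) log(2*pi*sigma^2) - sum_i (x_i - theta)^2 / (2*sigma^2),
# so d(log L)/d(theta) = 0 gives sum_i (x_i - theta) = 0,
# i.e. the ML estimate is the sample mean.
rng = random.Random(42)
theta_true = 1.5
x = [theta_true + rng.gauss(0.0, 1.0) for _ in range(200)]

theta_ml = statistics.mean(x)   # closed-form root of the likelihood equation
print(theta_ml)                 # close to theta_true for this sample
```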
Fall '08, Krim