Lecture 13 - Principles of Parameter Estimation (Statistics and Estimation)

Slide 1: Statistics and Estimation

Slide 2: Principles of Parameter Estimation
- Apply the earlier results to practical problems.
- Consider the problem of estimating an unknown parameter of interest from noisy observations of a related process.
- Examples: determining the daily temperature in a city, the depth of the ocean, or the location of a target.
- The observations (measurements) contain the nonrandom parameter θ of interest together with undesired noise:
  Observation = signal (desired part) + noise.

Slide 3: Observations and Estimation
- The i-th observation may be represented as X_i = θ + n_i, i = 1, 2, ..., n, where n_i denotes the noise in the i-th measurement.
- Here θ represents an unknown, nonrandom, desired parameter.
- The estimated parameter is designated θ̂(X) = θ̂(X_1, X_2, ..., X_n).

Slide 4: Problem Statement
- X_i, i = 1, 2, ..., n, represent random variables which may or may not be dependent.
- Given the observations X_1, X_2, ..., X_n, the estimation problem is to obtain the "best" estimate of the unknown parameter θ.
  - Denote by θ̂(X) the estimator for θ, which is a function of the observations X_1, X_2, ..., X_n.
  - "Best estimator" in what sense? Various optimization strategies yield different notions of "best."

Slide 5: Solution to the Problem
- The ideal solution would be for θ̂(X) to coincide with the unknown θ.
- This is of course not possible in general; almost always any estimate results in an error e = θ̂(X) − θ.
- One strategy is to select the estimator so as to minimize some function of this error, e.g. the minimum mean square error (MMSE) or the minimum absolute error.

Slide 6: MS Estimation
- If we choose the estimator function g(x) = φ(x) to minimize the MS error, then
  σ_e² = E{[θ − φ(X)]²} = ∫∫ [θ − φ(x)]² f(x, θ) dx dθ.
- The best estimator is θ̂ = φ(x).
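Slide 6 states the MS criterion and that the best estimator is θ̂ = φ(x), but it does not identify φ. The following is a standard completion, sketched here for reference under the assumption (implicit in the slide's joint density f(x, θ)) that θ is treated as random; it shows the minimizing φ is the conditional mean.

    % Sketch of the MS-error minimization, assuming theta is modeled with
    % the joint density f(x, theta) exactly as written on slide 6.
    \[
    \sigma_e^2
      = \iint \bigl[\theta - \varphi(x)\bigr]^2 f(x,\theta)\, d\theta\, dx
      = \int \Bigl( \int \bigl[\theta - \varphi(x)\bigr]^2 f(\theta \mid x)\, d\theta \Bigr) f_X(x)\, dx .
    \]
    % Since f_X(x) >= 0, it suffices to minimize the inner integral for each x:
    \[
    \frac{\partial}{\partial \varphi} \int \bigl[\theta - \varphi\bigr]^2 f(\theta \mid x)\, d\theta
      = -2 \int \bigl[\theta - \varphi\bigr] f(\theta \mid x)\, d\theta = 0
    \quad \Longrightarrow \quad
    \hat{\theta} = \varphi(x) = \int \theta\, f(\theta \mid x)\, d\theta = E\{\theta \mid X = x\}.
    \]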
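To make the observation model and the MS-error criterion of slides 3-6 concrete, here is a minimal simulation sketch in Python. The Gaussian noise, the specific values of θ, n, and σ, and the choice of the sample mean as the candidate estimator θ̂(X) are illustrative assumptions, not taken from the lecture; averaging e² over many trials approximates E{e²}, the quantity the MMSE criterion seeks to minimize.

    import numpy as np

    # Illustrative settings (assumptions, not from the slides): a nonrandom
    # parameter theta, additive Gaussian measurement noise, n observations.
    rng = np.random.default_rng(0)
    theta_true = 2.5
    n = 25
    sigma_noise = 1.0
    num_trials = 10_000

    squared_errors = []
    for _ in range(num_trials):
        noise = rng.normal(0.0, sigma_noise, size=n)
        x = theta_true + noise           # X_i = theta + n_i, i = 1, ..., n
        theta_hat = x.mean()             # candidate estimator theta_hat(X): the sample mean
        e = theta_hat - theta_true       # estimation error e = theta_hat(X) - theta
        squared_errors.append(e ** 2)

    # Empirical mean square error E{e^2}, approximating the MS criterion of slide 6.
    print("empirical MS error of the sample mean:", np.mean(squared_errors))
    print("theoretical sigma^2 / n for this Gaussian model:", sigma_noise ** 2 / n)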
Slide 7: Estimation Solution
- Another systematic approach is the principle of Maximum Likelihood (ML).
- Given X_1 = x_1, X_2 = x_2, ..., X_n = x_n, the joint pdf f_X(x_1, x_2, ..., x_n; θ) is the likelihood function.
- The method of maximum likelihood assumes that the given sample data set is representative of the population.
- With the observations x_1, x_2, ..., x_n fixed, f_X(x_1, x_2, ..., x_n; θ) is a function of θ alone.
- The value of θ that maximizes this pdf is the most likely value of θ, and it is chosen as the ML estimate θ̂_ML(X).

Slide 8: ML Estimate
- Choose the value of θ that most likely caused the observed data to occur, i.e.
  f_X(x_1, ..., x_n; θ̂_ML) = sup_θ f_X(x_1, ..., x_n; θ).
- The ML estimate can be determined either from the likelihood equation
  ∂f_X(x_1, ..., x_n; θ)/∂θ = 0,

Slide 9: ML Estimate (continued)
- or by maximizing the log-likelihood log f_X(x_1, ..., x_n; θ), which attains its maximum at the same θ since the logarithm is monotone.
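A minimal sketch of the ML recipe on slides 7-9: assuming, purely for illustration, that the noise n_i is i.i.d. zero-mean Gaussian with known variance (the slides do not fix a distribution), the log-likelihood can be maximized numerically over a grid of candidate θ values; for this particular model the maximizer coincides with the sample mean, the closed-form ML estimate.

    import numpy as np

    # Assumed model for illustration only: X_i = theta + n_i with
    # n_i ~ N(0, sigma^2) i.i.d. and sigma known, so f_X(x_1..x_n; theta)
    # is a product of Gaussian densities.
    rng = np.random.default_rng(1)
    theta_true, sigma, n = 2.5, 1.0, 25
    x = theta_true + rng.normal(0.0, sigma, size=n)

    def log_likelihood(theta, x, sigma):
        """log f_X(x_1, ..., x_n; theta) for the assumed Gaussian model."""
        return np.sum(-0.5 * np.log(2 * np.pi * sigma ** 2)
                      - (x - theta) ** 2 / (2 * sigma ** 2))

    # Maximize the log-likelihood over a grid of candidate theta values
    # (a numerical stand-in for solving the likelihood equation).
    grid = np.linspace(x.min(), x.max(), 2001)
    ll = np.array([log_likelihood(t, x, sigma) for t in grid])
    theta_ml_grid = grid[np.argmax(ll)]

    print("grid-search ML estimate:", theta_ml_grid)
    print("closed-form ML estimate (sample mean):", x.mean())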