12. Principles of Parameter Estimation

The purpose of this lecture is to illustrate the usefulness of the various concepts introduced and studied in earlier lectures to practical problems of interest. In this context, consider the problem of estimating an unknown parameter of interest from a few of its noisy observations. For example, determining the daily temperature in a city, or the depth of a river at a particular spot, are problems that fall into this category. The observations (measurements) are made on data that contain the desired nonrandom parameter θ together with undesired noise. Thus, for example,
Observation = signal (desired part) + noise,   (12-1)

or, the i-th observation can be represented as

$$ X_i = \theta + n_i, \qquad i = 1, 2, \ldots, n. $$   (12-2)

Here θ represents the unknown nonrandom desired parameter, and $n_i$, $i = 1, 2, \ldots, n$, represent random variables that may be dependent or independent from observation to observation. Given n observations $X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n$, the estimation problem is to obtain the "best" estimator for the unknown parameter θ in terms of these observations. Let us denote by $\hat{\theta}(\mathbf{X})$ the estimator for θ. Obviously $\hat{\theta}(\mathbf{X})$ is a function only of the observations. "Best estimator" in what sense? Various optimization strategies can be used to define the term "best".
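To make the setup in (12-2) concrete, the short Python sketch below simulates noisy observations and evaluates one natural candidate estimator, the sample mean. The zero-mean Gaussian noise model and the choice of the sample mean are assumptions made only for this illustration; the lecture has not yet singled out either one.

```python
import numpy as np

# Illustrative simulation of the observation model X_i = theta + n_i in (12-2).
# The Gaussian noise and the sample-mean estimator are assumptions of this sketch.
rng = np.random.default_rng(0)

theta_true = 3.7        # the unknown nonrandom parameter (known here only to the simulation)
n = 25                  # number of observations
noise = rng.normal(loc=0.0, scale=1.0, size=n)   # the n_i terms
x = theta_true + noise  # the observations x_1, ..., x_n

theta_hat = x.mean()    # one candidate estimator theta_hat(X): the sample mean
print(f"candidate estimate: {theta_hat:.3f}   true value: {theta_true}")
```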
The ideal solution would be when the estimate $\hat{\theta}(\mathbf{X})$ coincides with the unknown θ. This of course may not be possible, and almost always any estimate will result in an error given by

$$ e = \hat{\theta}(\mathbf{X}) - \theta. $$   (12-3)

One strategy would be to select the estimator $\hat{\theta}(\mathbf{X})$ so as to minimize some function of this error, such as minimization of the mean square error (MMSE), or minimization of the absolute value of the error, etc.
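To make the mean-square-error criterion concrete, the following Monte Carlo sketch approximates $E[e^2]$ for the sample-mean estimator by averaging the squared error (12-3) over many simulated data sets. It reuses the assumed i.i.d. Gaussian model from the previous snippet and is an illustration only, not a construction from the lecture itself.

```python
import numpy as np

# Monte Carlo approximation of the mean square error E[e^2] in (12-3) for the
# sample-mean estimator, under the same assumed i.i.d. Gaussian noise model.
rng = np.random.default_rng(1)
theta_true, n, trials = 3.7, 25, 100_000

x = theta_true + rng.normal(0.0, 1.0, size=(trials, n))  # trials independent data sets
errors = x.mean(axis=1) - theta_true                      # e = theta_hat(X) - theta per data set
print(f"estimated E[e^2]: {np.mean(errors**2):.4f}   (sigma^2 / n for this model: {1.0/n:.4f})")
```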
A more fundamental approach is that of the principle of Maximum Likelihood (ML). The underlying assumption in any estimation problem is that the available data $X_1, X_2, \ldots, X_n$ has something to do with the unknown parameter θ. More precisely, we assume that the joint p.d.f. of $X_1, X_2, \ldots, X_n$, given by $f_{\mathbf{X}}(x_1, x_2, \ldots, x_n; \theta)$, depends on θ. The method of maximum likelihood assumes that the given sample data set is representative of the population $f_{\mathbf{X}}(x_1, x_2, \ldots, x_n; \theta)$, and chooses that value for θ which most likely caused the observed data to occur; i.e., once the observations $x_1, x_2, \ldots, x_n$ are given, $f_{\mathbf{X}}(x_1, x_2, \ldots, x_n; \theta)$ is a function of θ alone, and the value of θ that maximizes this p.d.f. is the most likely value for θ. It is chosen as the ML estimate $\hat{\theta}_{ML}(\mathbf{X})$ for θ (Fig. 12.1).

[Fig. 12.1: the likelihood function $f_{\mathbf{X}}(x_1, x_2, \ldots, x_n; \theta)$ plotted as a function of θ, with its maximizer marked as $\hat{\theta}_{ML}(\mathbf{X})$.]
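The sketch below illustrates this maximization numerically: with the data fixed, it evaluates the log of the joint p.d.f. over a grid of candidate θ values and picks the maximizer. The i.i.d. Gaussian noise model with known unit variance is again an assumption made only for this illustration.

```python
import numpy as np

# Numerical illustration of the ML principle: with the data fixed, evaluate the
# (log of the) joint p.d.f. f_X(x_1,...,x_n; theta) on a grid of theta values and
# pick the maximizer. The i.i.d. N(theta, 1) model is an assumption of this sketch.
rng = np.random.default_rng(2)
theta_true, n = 3.7, 25
x = theta_true + rng.normal(0.0, 1.0, size=n)      # the given observations x_1, ..., x_n

theta_grid = np.linspace(0.0, 8.0, 2001)           # candidate values of theta
# Gaussian log-likelihood, up to an additive constant that does not depend on theta
log_lik = np.array([-0.5 * np.sum((x - t) ** 2) for t in theta_grid])

theta_ml = theta_grid[np.argmax(log_lik)]          # grid approximation to the ML estimate
print(f"grid-search ML estimate: {theta_ml:.3f}")
```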
Given $X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n$, the joint p.d.f. $f_{\mathbf{X}}(x_1, x_2, \ldots, x_n; \theta)$ represents the likelihood function, and the ML estimate can be determined either from the likelihood equation

$$ \hat{\theta}_{ML}(\mathbf{X}) = \arg \sup_{\theta} f_{\mathbf{X}}(x_1, x_2, \ldots, x_n; \theta), $$   (12-4)

or using the log-likelihood function (sup in (12-4) represents the supremum operation)

$$ \hat{\theta}_{ML}(\mathbf{X}) = \arg \sup_{\theta} \log f_{\mathbf{X}}(x_1, x_2, \ldots, x_n; \theta). $$   (12-5)

If $\log f_{\mathbf{X}}(x_1, x_2, \ldots, x_n; \theta)$ is differentiable and a supremum exists in (12-5), then $\hat{\theta}_{ML}$ must satisfy the equation

$$ \left. \frac{\partial \log f_{\mathbf{X}}(x_1, x_2, \ldots, x_n; \theta)}{\partial \theta} \right|_{\theta = \hat{\theta}_{ML}} = 0. $$   (12-6)

We will illustrate the above procedure through several examples.
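As one worked instance of (12-4)-(12-6), consider again the assumed i.i.d. Gaussian noise model $n_i \sim N(0, \sigma^2)$, so that $X_i \sim N(\theta, \sigma^2)$; this is a sketch under that assumption, not necessarily one of the examples the lecture takes up next. The likelihood function is

$$ f_{\mathbf{X}}(x_1, \ldots, x_n; \theta) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x_i - \theta)^2 / 2\sigma^2}, $$

so the log-likelihood is

$$ \log f_{\mathbf{X}}(x_1, \ldots, x_n; \theta) = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i - \theta)^2, $$

and setting the derivative in (12-6) to zero gives

$$ \frac{1}{\sigma^2}\sum_{i=1}^{n}\left(x_i - \hat{\theta}_{ML}\right) = 0 \quad\Longrightarrow\quad \hat{\theta}_{ML}(\mathbf{X}) = \frac{1}{n}\sum_{i=1}^{n} x_i, $$

i.e., under this assumed model the ML estimate is the sample mean, consistent with the numerical sketches above.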