# [RP]Lecture Note VIII - Kyung Hee University Department of...


Kyung Hee University, Random Processing, Department of Electronics and Radio Engineering, Prof. Hyundong Shin, Communications and Coding Theory Laboratory (CCTLAB)

C1002900 RP Lecture Handout VIII: Parameter Estimation

Reading: Chapter 8.1–8.3

In a statistical investigation, there are two general classes of problems:

(i) The probabilistic model is known, and we wish to make predictions concerning future observations. For example, we know the distribution $F_X(x)$ of a random variable $X$ and we wish to predict the average $\bar{x}$ of its $n$ future samples.

(ii) One or more parameters $\theta_i$ of the model are unknown, and we wish either to estimate their values (parameter estimation) or to decide whether the $\theta_i$ equal a set of known constants $\theta_{i0}$ (hypothesis testing). In this case the distribution is written $F_X(x;\theta_i)$. For example, we observe the values $x_i$ of $X$ and wish to estimate its mean $\eta_X$ (nonrandom but unknown), or to decide whether to accept the hypothesis that $\eta_X = 5$.

Let $\mathbf{X} = [X_1\; X_2\; \cdots\; X_n]^T$ denote $n$ random variables representing the observations $\mathbf{x} = [x_1\; x_2\; \cdots\; x_n]^T$.
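As a concrete illustration of case (ii), the following minimal Python sketch (illustrative only; the Gaussian model, seed, and sample size are assumptions, not from the handout) estimates an unknown mean $\eta_X$ by the sample average of $n$ observations:

```python
import random

def estimate_mean(samples):
    """Estimate the unknown mean of X by the sample average g(x) = (1/n) * sum(x_i)."""
    return sum(samples) / len(samples)

# Simulate n observations of X ~ N(eta, 1), where eta = 5 plays the role of the
# unknown (to the estimator) parameter.
random.seed(0)
eta = 5.0
samples = [random.gauss(eta, 1.0) for _ in range(10_000)]

eta_hat = estimate_mean(samples)
print(eta_hat)  # close to 5.0 for large n
```

Note that `estimate_mean` uses only the observed data, never `eta` itself, which is exactly the validity condition imposed on $g(\mathbf{x})$ below.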

We are interested in estimating an unknown nonrandom parameter $\theta$ based on this data. The joint pdf $f_{\mathbf{X}}(\mathbf{x};\theta)$ of $\mathbf{X}$ depends on $\theta$. We form an estimate of the unknown parameter as a function of the observed samples:

$$\hat{\theta} = g(\mathbf{x}). \tag{VIII.1}$$

For $g(\mathbf{x})$ to be a valid estimate, it must not be a function of $\theta$ itself.

Estimator (a statistic):

$$\hat{\Theta} = g(\mathbf{X}). \tag{VIII.2}$$

## 8.1. Bias and Error Variance

Estimator error:

$$e = \hat{\Theta} - \theta. \tag{VIII.3}$$

The bias in an estimator $\hat{\Theta}$ is defined as

$$b_{\hat{\Theta}}(\theta) = \mathbb{E}[e] = \mathbb{E}\big[\hat{\Theta}\big] - \theta. \tag{VIII.4}$$

The error variance:

$$\sigma_{\hat{\Theta}}^2(\theta) = \mathbb{E}\!\left[\big(e - b_{\hat{\Theta}}\big)^2\right] = \mathbb{E}\!\left[\big(\hat{\Theta} - \mathbb{E}[\hat{\Theta}]\big)^2\right]. \tag{VIII.5}$$

Both $b_{\hat{\Theta}}$ and $\sigma_{\hat{\Theta}}^2$ are in general functions of the parameter $\theta$. We say an estimator $\hat{\Theta}$ of a nonrandom parameter $\theta$ is unbiased if $b_{\hat{\Theta}} = 0$ for all possible values of $\theta$, i.e., $\mathbb{E}\big[\hat{\Theta}\big] = \theta$. If $\hat{\Theta}_1$ and $\hat{\Theta}_2$ are both unbiased estimators of $\theta$, we naturally want the one with the lower variance.
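To make the bias definition in (VIII.4) concrete, here is a small Monte Carlo sketch (Python, illustrative only; the Gaussian samples, sample size, and trial count are assumptions). It compares two estimators of the variance $\sigma^2$ of $X$: the $1/n$ form, which is biased, and the $1/(n-1)$ form, which is unbiased.

```python
import random

def var_biased(x):
    """Variance estimate with 1/n normalization: E[.] = ((n-1)/n) * sigma^2, so biased."""
    m = sum(x) / len(x)
    return sum((xi - m) ** 2 for xi in x) / len(x)

def var_unbiased(x):
    """Variance estimate with 1/(n-1) normalization: unbiased."""
    m = sum(x) / len(x)
    return sum((xi - m) ** 2 for xi in x) / (len(x) - 1)

def empirical_bias(estimator, true_theta, n=5, trials=50_000):
    """Approximate b = E[theta_hat] - theta by averaging the estimator over many trials."""
    random.seed(1)
    total = 0.0
    for _ in range(trials):
        x = [random.gauss(0.0, 1.0) for _ in range(n)]  # true sigma^2 = 1
        total += estimator(x)
    return total / trials - true_theta

print(empirical_bias(var_biased, 1.0))    # roughly -1/n = -0.2
print(empirical_bias(var_unbiased, 1.0))  # roughly 0
```

The negative bias of the $1/n$ estimator, $-\sigma^2/n$, also illustrates that bias generally depends on the unknown parameter, as noted above for $b_{\hat{\Theta}}(\theta)$.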
How does one find an unbiased estimator of $\theta$ with the minimum possible variance? And is it possible to express a lower bound on the variance of all unbiased estimators?

## 8.2. Minimum-Variance Unbiased (MVU) Estimator

Let $\Xi$ denote the set of all estimators that are unbiased:

$$\Xi = \left\{ \hat{\Theta} : \hat{\Theta} \text{ is valid such that } b_{\hat{\Theta}}(\theta) = 0 \;\; \forall \theta \right\}. \tag{VIII.6}$$

Then, when it exists, the MVU estimator of $\theta$ is defined as

$$\hat{\Theta}_{\mathrm{MVU}} = \operatorname*{arg\,min}_{\hat{\Theta} \in \Xi} \; \sigma_{\hat{\Theta}}^2(\theta), \quad \forall \theta. \tag{VIII.7}$$

$\hat{\Theta}_{\mathrm{MVU}}$ may not exist. For some problems the set $\Xi$ is empty: there are no unbiased estimators at all. In other cases, …
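The definition in (VIII.7) ranks unbiased estimators by their variance. A textbook-style illustration (a Python sketch under assumed Gaussian sampling, not from the handout): for i.i.d. samples, both the single observation $X_1$ and the sample mean are unbiased estimators of the mean $\eta$, but the sample mean has variance $\sigma^2/n$ versus $\sigma^2$ for $X_1$, so it is the preferable candidate.

```python
import random

def simulate_variances(n=10, trials=20_000, eta=3.0, sigma=2.0):
    """Empirically compare the variance of two unbiased estimators of the mean eta:
    theta1_hat = X_1 (first sample only) and theta2_hat = sample mean of n samples."""
    random.seed(2)
    est1, est2 = [], []
    for _ in range(trials):
        x = [random.gauss(eta, sigma) for _ in range(n)]
        est1.append(x[0])        # unbiased, variance sigma^2
        est2.append(sum(x) / n)  # unbiased, variance sigma^2 / n

    def var(v):
        m = sum(v) / len(v)
        return sum((vi - m) ** 2 for vi in v) / (len(v) - 1)

    return var(est1), var(est2)

v1, v2 = simulate_variances()
print(v1, v2)  # roughly sigma^2 = 4 and sigma^2 / n = 0.4
```

Showing that no *other* unbiased estimator can do better requires a variance lower bound, which is the question posed at the top of this section.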


