Detection and Estimation (Spring, 2009), NCTU EE

Bayesian Estimators

- Now we assume we have some prior knowledge about θ. To incorporate it, we model θ as a random variable with a given pdf.
- Bayesian estimation is useful in situations where an MVU estimator cannot be found.

~ Prior Knowledge and Estimation

Example: DC level in WGN, but with $-A_0 \le A \le A_0$ instead of $-\infty < A < \infty$.

MVU estimator: the sample mean, $\hat{A} = \bar{x}$.

Truncated sample mean estimator:

$$\check{A} = \begin{cases} -A_0, & \bar{x} < -A_0 \\ \bar{x}, & -A_0 \le \bar{x} \le A_0 \\ A_0, & \bar{x} > A_0 \end{cases}$$

$\check{A}$ is better than $\hat{A}$ in terms of MSE (a numerical comparison is sketched at the end of these notes). We reduce the mean square error by allowing the estimator to be biased!

Bayesian approach: A is considered to be a random variable with a prior pdf, and we attempt to estimate a particular realization of A. For example, assume A ~ U[-A₀, A₀].

Classical MSE:
$$\mathrm{mse}(\hat{A}) = \int (\hat{A} - A)^2\, p(\mathbf{x}; A)\, d\mathbf{x}$$

Bayesian MSE:
$$\mathrm{Bmse}(\hat{A}) = E\big[(A - \hat{A})^2\big] = \iint (A - \hat{A})^2\, p(\mathbf{x}, A)\, d\mathbf{x}\, dA = \int \left[ \int (A - \hat{A})^2\, p(A|\mathbf{x})\, dA \right] p(\mathbf{x})\, d\mathbf{x}$$

Remark: the classical MSE depends on A; the Bayesian MSE does not.

Since p(x) ≥ 0 for all x, if the integral in brackets can be minimized for each x, then the Bayesian MSE will be minimized.

Remarks:
- The optimal estimator in terms of minimizing the Bayesian MSE is the mean of the posterior pdf p(A|x):
$$\hat{A} = E[A|\mathbf{x}] = \int A\, p(A|\mathbf{x})\, dA, \qquad p(A|\mathbf{x}) = \frac{p(\mathbf{x}|A)\, p(A)}{\int p(\mathbf{x}|A)\, p(A)\, dA}$$
- p(A): the prior pdf of A.
- The estimator that minimizes the Bayesian MSE is termed the minimum mean square error (MMSE) estimator.

Example: p(A) = U[-A₀, A₀]. Assume w[n] is independent of A. Then

$$p(A|\mathbf{x}) = \begin{cases} \dfrac{1}{c} \exp\!\left[-\dfrac{(A - \bar{x})^2}{2\sigma^2/N}\right], & |A| \le A_0 \\[4pt] 0, & |A| > A_0 \end{cases}$$

where

$$c = \int_{-A_0}^{A_0} \exp\!\left[-\frac{(A - \bar{x})^2}{2\sigma^2/N}\right] dA.$$

The MMSE estimator:

$$\hat{A} = E[A|\mathbf{x}] = \frac{\displaystyle\int_{-A_0}^{A_0} A \exp\!\left[-\frac{N(A - \bar{x})^2}{2\sigma^2}\right] dA}{\displaystyle\int_{-A_0}^{A_0} \exp\!\left[-\frac{N(A - \bar{x})^2}{2\sigma^2}\right] dA}$$

This cannot be evaluated in closed form (a numerical evaluation is sketched at the end of these notes).

Remarks:
- $\hat{A}$ is biased towards zero unless $A_0^2 \gg \sigma^2/N$. In general, $\hat{A}$ is biased towards the prior mean.
- As N increases, the MMSE estimator relies less and less on the prior knowledge and more on the data.
- Before the observation, our knowledge is summarized by the prior pdf p(θ). After the observation, our state of knowledge about the parameter is summarized by the posterior pdf p(θ|x).
- The choice of the prior pdf is critical in Bayesian estimation. A wrong choice will result in a poor estimator.
- An optimal estimator is defined to be the one that minimizes the MSE when averaged over all realizations of θ and x: the MMSE estimator.

~ Choosing a Prior PDF

- The prior should be based on the physical constraints of the problem.
- Choose a prior that allows easy integration.

Example: DC level in WGN with a Gaussian prior pdf.

Let A ~ N(μ_A, σ_A²) and x[n] = A + w[n], where w[n] is WGN with variance σ², independent of A.

The posterior pdf is also Gaussian, but with a different mean and variance:

$$p(A|\mathbf{x}) = \frac{1}{\sqrt{2\pi\sigma_{A|x}^2}} \exp\!\left[-\frac{(A - \mu_{A|x})^2}{2\sigma_{A|x}^2}\right]$$

where

$$\sigma_{A|x}^2 = \frac{1}{\dfrac{N}{\sigma^2} + \dfrac{1}{\sigma_A^2}}, \qquad \mu_{A|x} = \sigma_{A|x}^2 \left(\frac{N\bar{x}}{\sigma^2} + \frac{\mu_A}{\sigma_A^2}\right).$$

The MMSE estimator is therefore

$$\hat{A} = E[A|\mathbf{x}] = \mu_{A|x} = \alpha\bar{x} + (1 - \alpha)\mu_A, \qquad \alpha = \frac{\sigma_A^2}{\sigma_A^2 + \sigma^2/N}.$$

Remark: α is a weighting factor (0 < α < 1) that compromises between the prior knowledge and the data knowledge (a numerical illustration is sketched at the end of these notes).

The prior knowledge improves the estimation accuracy: the posterior variance $\sigma_{A|x}^2 = \alpha\,\sigma^2/N$ is smaller than $\sigma^2/N$, the variance of the sample mean alone.
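Numerical check of the truncated-sample-mean claim above. This is a minimal Monte Carlo sketch, not part of the original notes; the values A0 = 1, sigma = 1, N = 5 and the true level A = 0.9 are arbitrary illustration choices.

import numpy as np

rng = np.random.default_rng(0)
A0, sigma, N = 1.0, 1.0, 5       # illustration values (assumed, not from the notes)
A_true = 0.9                     # true DC level, inside [-A0, A0] but near the edge
trials = 100_000

x = A_true + sigma * rng.standard_normal((trials, N))
xbar = x.mean(axis=1)                    # MVU estimator: sample mean
A_check = np.clip(xbar, -A0, A0)         # truncated sample mean

print("MSE(sample mean)    =", np.mean((xbar - A_true) ** 2))     # about sigma^2/N = 0.2
print("MSE(truncated mean) =", np.mean((A_check - A_true) ** 2))  # smaller, though biased

Clipping to [-A0, A0] can only move an estimate closer to any value inside that interval, so the truncated estimator never does worse on a given realization; the price is a bias toward zero.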
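The uniform-prior MMSE integral above has no closed form, but it is easy to evaluate on a grid. A minimal sketch; the function name, grid size, and example values are assumptions for illustration only.

import numpy as np

def mmse_uniform_prior(xbar, N, sigma, A0, grid=2001):
    # Posterior for A ~ U[-A0, A0] with x[n] = A + WGN(sigma^2): a Gaussian
    # N(xbar, sigma^2/N) truncated to [-A0, A0]. Approximate its mean by a
    # Riemann sum on a uniform grid; the normalizing constant cancels in the ratio.
    A = np.linspace(-A0, A0, grid)
    w = np.exp(-(A - xbar) ** 2 / (2 * sigma ** 2 / N))
    return np.sum(A * w) / np.sum(w)

print(mmse_uniform_prior(xbar=0.3, N=10, sigma=1.0, A0=1.0))  # close to 0.3
print(mmse_uniform_prior(xbar=1.5, N=10, sigma=1.0, A0=1.0))  # pulled well below 1.5

The second call shows the bias toward the prior mean: even though the data suggest 1.5, the estimate stays inside [-A0, A0].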

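For the Gaussian prior the MMSE estimator is available in closed form, as derived above. A short sketch of the weighting behaviour; the true level 0.8, prior N(0, 0.25), and sigma = 1 are assumed illustration values.

import numpy as np

def mmse_gaussian_prior(x, sigma, mu_A, var_A):
    # Closed-form MMSE estimate A_hat = alpha*xbar + (1 - alpha)*mu_A,
    # with alpha = var_A / (var_A + sigma^2/N); also return the posterior variance.
    N = len(x)
    alpha = var_A / (var_A + sigma ** 2 / N)
    post_var = alpha * sigma ** 2 / N          # equals 1 / (N/sigma^2 + 1/var_A)
    return alpha * np.mean(x) + (1 - alpha) * mu_A, post_var

rng = np.random.default_rng(1)
for N in (2, 10, 100):
    x = 0.8 + rng.standard_normal(N)           # data generated with true A = 0.8
    A_hat, post_var = mmse_gaussian_prior(x, sigma=1.0, mu_A=0.0, var_A=0.25)
    print(N, round(A_hat, 3), round(post_var, 4))

As N grows, alpha approaches 1, so the estimate relies more on the data and the posterior variance shrinks below sigma^2/N, matching the remarks above.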
