VER. 9/11/2012. © P. KOLM

Efficiency

Intuition: We want our estimator to be “closer to the truth on average” than any other estimator.

Mathematically, we say an unbiased estimator, $\hat{\theta}_1$, is more efficient than another unbiased estimator, $\hat{\theta}_2$, if the sample variance of $\hat{\theta}_1$ is less than that of $\hat{\theta}_2$. That is,

$$\mathrm{Var}(\hat{\theta}_1) \le \mathrm{Var}(\hat{\theta}_2)$$

Mean Squared Error (1/2)

Intuition: What if we can’t find an unbiased estimator that is “efficient enough”?

Mean Squared Error (2/2)
Let $\hat{\theta}$ be an estimate of $\theta$ and $\theta^* = E(\hat{\theta})$, but $\theta^* \ne \theta$. We say that

$$\mathrm{Bias}(\hat{\theta}) \equiv \theta^* - \theta$$

is the bias of $\hat{\theta}$.

We define the mean squared error (MSE) of an estimator as

$$\mathrm{MSE}(\hat{\theta}) = E\left[(\hat{\theta} - \theta)^2\right]$$

The MSE describes the tradeoff between bias and efficiency, since

$$\mathrm{MSE}(\hat{\theta}) = \mathrm{Var}(\hat{\theta}) + \mathrm{Bias}(\hat{\theta})^2$$

We call this decomposition the bias-variance tradeoff:
o First term: variance of the estimator
o Second term: squared bias of the estimator
o Shrinkage estimators are based on this idea

Example: Earlier we looked at two estimators for the population variance. We showed that

$$s_Y^2 = \frac{1}{n-1}\sum_{i=1}^{n} (Y_i - \bar{Y})^2$$

is unbiased, whereas

$$\hat{\sigma}_Y^2 = \frac{1}{n}\sum_{i=1}^{n} (Y_i - \bar{Y})^2$$

is biased downward. In particular,

$$E(\hat{\sigma}_Y^2) = \frac{n-1}{n}\,\sigma_Y^2$$

However, we pointed out that

$$\mathrm{Var}(\hat{\sigma}_Y^2) < \mathrm{Var}(s_Y^2)$$

This means that the biased estimator has a smaller variance. To compare the two estimators, we calculate the difference in their MSEs (For you: Show this.) and obtain
$$\mathrm{MSE}(\hat{\sigma}_Y^2) - \mathrm{MSE}(s_Y^2) = \sigma_Y^4\left(\frac{2n-1}{n^2} - \frac{2}{n-1}\right) < 0$$
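This result can be checked with a minimal Monte Carlo sketch (not from the original slides; the true variance, sample size, and replication count are illustrative choices), which also estimates the bias of $\hat{\sigma}_Y^2$ directly:

```python
import random

# Monte Carlo sketch: compare the MSEs of the unbiased estimator s^2
# (1/(n-1) factor) and the biased estimator sigma_hat^2 (1/n factor)
# for i.i.d. normal data. All numerical settings are illustrative.
random.seed(42)

sigma2 = 4.0       # true population variance (assumed for the demo)
n = 10             # sample size
reps = 100_000     # Monte Carlo replications

se_unbiased = 0.0  # running sum of squared errors of s^2
se_biased = 0.0    # running sum of squared errors of sigma_hat^2
sum_biased = 0.0   # running sum of sigma_hat^2 (to estimate its bias)

for _ in range(reps):
    y = [random.gauss(0.0, sigma2 ** 0.5) for _ in range(n)]
    ybar = sum(y) / n
    ss = sum((yi - ybar) ** 2 for yi in y)
    s2 = ss / (n - 1)        # unbiased estimator of sigma2
    sig2_hat = ss / n        # biased (downward) estimator of sigma2
    se_unbiased += (s2 - sigma2) ** 2
    se_biased += (sig2_hat - sigma2) ** 2
    sum_biased += sig2_hat

mse_unbiased = se_unbiased / reps
mse_biased = se_biased / reps
bias_emp = sum_biased / reps - sigma2   # should be close to -sigma2 / n

# Theoretical MSE difference from the slide:
# sigma^4 * ((2n - 1) / n^2 - 2 / (n - 1)), which is negative
theory_diff = sigma2 ** 2 * ((2 * n - 1) / n ** 2 - 2 / (n - 1))

print(mse_biased < mse_unbiased)
print(round(mse_biased - mse_unbiased, 3), round(theory_diff, 3))
```

Despite its downward bias, the $1/n$ estimator trades a small squared bias for a larger reduction in variance, which is the sense in which shrinkage can pay off.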
Asymptotic Properties

Motivation: In practice, it is sometimes hard to show whether an estimator is unbiased or what its sample variance is. Instead, we may then try to obtain approximate results about its distribution as the sample becomes large. These properties are referred to as asymptotic properties, and they answer the question “what happens to the estimator as the sample size goes to infinity?”

Asymptotic Property: Consistency
Intuition: An estimator is consistent if the values it takes on, say $\hat{\theta}_n$, that are not close to the true parameter, $\theta$, become increasingly unlikely as the sample size increases (i.e. $n \to \infty$).

Formally, we say that an estimator $\hat{\theta}_n$ of a parameter $\theta$ is a consistent estimator of $\theta$ if, for any $\varepsilon > 0$, $P(|\hat{\theta}_n - \theta| > \varepsilon) \to 0$ as $n \to \infty$. We write $\hat{\theta}_n \xrightarrow{\,p\,} \theta$ and $\operatorname{plim}_{n \to \infty} \hat{\theta}_n = \theta$.
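To make the definition concrete, the tail probability $P(|\hat{\theta}_n - \theta| > \varepsilon)$ can be estimated by simulation. The sketch below (not from the slides; the distribution, $\varepsilon$, and sample sizes are illustrative) uses the sample mean of i.i.d. $N(\mu, 1)$ data:

```python
import random

# Sketch: estimate P(|Ybar_n - mu| > eps) for the sample mean of
# i.i.d. N(mu, 1) data at increasing n. Consistency says this
# probability should shrink toward 0. Settings are illustrative.
random.seed(0)

mu, eps, reps = 1.0, 0.25, 2000

def tail_prob(n):
    """Fraction of simulated samples whose mean misses mu by more than eps."""
    misses = 0
    for _ in range(reps):
        ybar = sum(random.gauss(mu, 1.0) for _ in range(n)) / n
        if abs(ybar - mu) > eps:
            misses += 1
    return misses / reps

probs = [tail_prob(n) for n in (10, 100, 1000)]
print(probs)  # decreasing toward 0 as n grows
```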
Remarks:

If an estimator is not consistent, then it does not help us learn about $\theta$, even with an unlimited amount of data. Therefore, consistency is a minimal requirement of an estimator used in statistics or econometrics.

An unbiased estimator is not necessarily consistent
o Example: Choose the first observation in the sample, $Y_1$, as an estimate of the population mean $\mu_Y$. This estimator is unbiased since $E(Y_1) = \mu_Y$. But $\operatorname{plim}(Y_1) \ne \mu_Y$

An unbiased estimator, $\hat{\theta}_n$, is consistent if $\mathrm{Var}(\hat{\theta}_n) \to 0$ as $n \to \infty$
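The $Y_1$ example above can also be checked by simulation. In this sketch (illustrative settings, not from the slides), the miss rate of $Y_1$ stays flat as $n$ grows, while that of $\bar{Y}_n$ vanishes:

```python
import random

# Sketch: Y1 is unbiased for mu but inconsistent -- its chance of missing
# mu by more than eps does not shrink as n grows, unlike the sample mean.
# All numerical settings are illustrative.
random.seed(1)

mu, eps, reps = 2.0, 0.25, 2000

def miss_rates(n):
    """Return (miss rate of Y1, miss rate of Ybar_n) over simulated samples."""
    miss_first = miss_mean = 0
    for _ in range(reps):
        y = [random.gauss(mu, 1.0) for _ in range(n)]
        if abs(y[0] - mu) > eps:
            miss_first += 1
        if abs(sum(y) / n - mu) > eps:
            miss_mean += 1
    return miss_first / reps, miss_mean / reps

rates = {n: miss_rates(n) for n in (10, 1000)}
print(rates)
```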
This document was uploaded on 02/17/2014 for the course COURANT G63.2751.0 at NYU.