Econometrics-I-15

Generalized least squares - efficient estimation


Assuming the weights are known, GLS is the efficient estimator. When the weights must be estimated, use two-step generalized least squares:

- Step 1: Use least squares, then use the residuals to estimate the weights.
- Step 2: Weighted least squares using the estimated weights.
- (Iteration: After step 2, recompute the residuals and return to step 1. Exit when the coefficient vector stops changing. A code sketch appears at the end of this section.)

Autocorrelation

The analysis of "autocorrelation" in the narrow sense of correlation of the disturbances across time largely parallels the discussions we have already done for the GR model in general and for heteroscedasticity in particular. One difference is that the relatively crisp results for the model of heteroscedasticity are replaced with relatively fuzzy, somewhat imprecise results here. The reason is that it is much more difficult to characterize meaningfully "well behaved" data in a time series context. Thus, for example, in contrast to the sharp result that produces the White robust estimator, the theory underlying the Newey-West robust estimator is somewhat ambiguous: it rests on a rather bland statement about "how far one must go back in time until correlation becomes unimportant."

The Familiar AR(1) Model

ε_t = ρ ε_{t-1} + u_t, with |ρ| < 1.

This characterizes the disturbances, not the regressors: a general characterization of the mechanism producing the disturbance as history plus current innovations.

- Analysis of this model in particular: the mean, variance, and autocovariance.
- Stationarity; time series analysis.
- Implication: the form of σ²Ω; Var[ε] vs. Var[u] (written out below).
- Other models for autocorrelation are used less frequently; AR(1) is the workhorse.

Building the Model

- Prior view: autocorrelation is a feature of the data.
  - "Account for autocorrelation in the data."
  - Different models, different estimators.
- Contemporary view: ask why there is autocorrelation.
  - What is missing from the model?
  - Build in appropriate dynamic structures.
  - Autocorrelation should be "built out" of the model.
  - Use robust procedures (Newey-West) instead of elaborate models built specifically for the autocorrelation.

Model Misspecification

Implications for Least Squares

Familiar results: least squares remains consistent, unbiased, and asymptotically normal, but it is inefficient. The inefficiency of least squares is difficult to characterize generally; it is worst in "low frequency", i.e., long-period (annual), slowly evolving data ...
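
The two-step and iterated procedure at the top of this section can be put into code. The sketch below is a minimal illustration, not the slides' own implementation: it assumes a multiplicative heteroscedasticity form for the weights, Var[ε_i] = σ² exp(z_i'γ), estimated by regressing the log of the squared OLS residuals on the weight regressors Z (which should include a constant). The names iterated_fgls, Z, tol, and max_iter are ours.

```python
import numpy as np

def iterated_fgls(y, X, Z, tol=1e-8, max_iter=50):
    """Two-step (iterated) feasible GLS with estimated variance weights.

    Step 1: OLS, then regress log(e^2) on Z to estimate the weight function
            (a multiplicative-heteroscedasticity form, assumed here).
    Step 2: Weighted least squares using the estimated weights.
    Iterate, exiting when the coefficient vector stops changing.
    """
    b = np.linalg.lstsq(X, y, rcond=None)[0]            # step 1: OLS start
    for _ in range(max_iter):
        e = y - X @ b                                    # residuals
        g = np.linalg.lstsq(Z, np.log(e**2 + 1e-12), rcond=None)[0]
        w = 1.0 / np.exp(Z @ g)                          # estimated 1/sigma_i^2
        Xw = X * w[:, None]                              # step 2: WLS
        b_new = np.linalg.solve(Xw.T @ X, Xw.T @ y)      # (X'WX)^{-1} X'Wy
        if np.max(np.abs(b_new - b)) < tol:              # exit rule from the slide
            return b_new
        b = b_new
    return b
```

The exit rule on the coefficient vector matches the slide; iterating beyond the second step does not change the estimator's asymptotic properties.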
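
For the AR(1) disturbance above, the mean, variance, and autocovariance referred to on the slide are the standard stationarity results, and they pin down the form of σ²Ω:

```latex
% Stationary AR(1) disturbances: \varepsilon_t = \rho\,\varepsilon_{t-1} + u_t,\ |\rho|<1,\ u_t \sim (0,\sigma_u^2)\ \text{i.i.d.}
E[\varepsilon_t] = 0, \qquad
\operatorname{Var}[\varepsilon_t] = \sigma_\varepsilon^2
  = \frac{\sigma_u^2}{1-\rho^2} \;\ge\; \sigma_u^2 = \operatorname{Var}[u_t], \qquad
\operatorname{Cov}[\varepsilon_t,\varepsilon_{t-s}] = \rho^{s}\,\sigma_\varepsilon^2 .

\sigma^2 \Omega \;=\; \frac{\sigma_u^2}{1-\rho^2}
\begin{bmatrix}
1 & \rho & \rho^2 & \cdots & \rho^{T-1} \\
\rho & 1 & \rho & \cdots & \rho^{T-2} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
\rho^{T-1} & \rho^{T-2} & \rho^{T-3} & \cdots & 1
\end{bmatrix}.
```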
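
The Newey-West estimator mentioned in the autocorrelation discussion can also be sketched. This is a minimal version using the usual Bartlett-kernel weights; the lag truncation L is the user's choice, which is exactly the "how far one must go back in time" question the slide calls imprecise. The function name newey_west_cov and its arguments are ours.

```python
import numpy as np

def newey_west_cov(X, e, L):
    """Newey-West (HAC) covariance matrix for the OLS coefficients.

    X : (n, k) regressor matrix
    e : (n,) OLS residuals
    L : lag truncation (how far back correlation is allowed to matter)
    Bartlett weights w_l = 1 - l/(L+1).
    """
    Xe = X * e[:, None]                        # rows x_t * e_t
    S = Xe.T @ Xe                              # lag-0 term (White part)
    for l in range(1, L + 1):
        w = 1.0 - l / (L + 1.0)
        G = Xe[l:].T @ Xe[:-l]                 # sum over t of x_t e_t e_{t-l} x_{t-l}'
        S += w * (G + G.T)
    XtX_inv = np.linalg.inv(X.T @ X)
    return XtX_inv @ S @ XtX_inv               # sandwich estimator
```

A common rule of thumb sets L near floor(4 (T/100)^(2/9)), but any choice embodies the same judgment about how quickly the autocorrelation dies out.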