James B. McDonald, Brigham Young University, 11/16/11

V. Violations of the Basic Assumptions in the Classical Normal Linear Regression Model

A. Introductory Comments, B. Nonnormality of errors, C. Nonzero mean of errors, D. Generalized Regression Model, E. Heteroskedasticity, F. Autocorrelation, G. Panel Data, H. Stochastic X's, I. Measurement Error, J. Specification Error

A. Introductory Comments

The Classical Normal Linear Regression Model is defined by

	y = Xβ + ε

where
(A.1) ε is distributed normally
(A.2) E(ε_t) = 0 for all t
(A.3) Var(ε_t) = σ² for all t
(A.4) Cov(ε_t, ε_s) = 0 for t ≠ s
(A.5) The X's are nonstochastic and lim_{n→∞} (X′X)/n is nonsingular; denote this limit by Σ_X.

Recall that assumptions (A.1)-(A.4) can be written more compactly as ε ~ N[0, σ²I]. In section (II′) we demonstrated that under assumptions (A.1)-(A.5) the least squares estimator (β̂), the maximum likelihood estimator (β̂_ML), and the best linear unbiased estimator (β̃) are identical, i.e.,

	β̂ = β̃ = β̂_ML = (X′X)⁻¹X′y   and   β̂ ~ N[β, σ²(X′X)⁻¹].

Additionally, we proved that the least squares estimator β̂ (hence β̃ and β̂_ML) is unbiased, has minimum variance among all unbiased estimators, and is
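As an illustration (added here, not part of the original notes), the closed-form estimator β̂ = (X′X)⁻¹X′y and its sampling covariance σ²(X′X)⁻¹ can be sketched in Python with NumPy. The data are simulated, so the sample size, coefficients, and σ are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 3
# Design matrix with an intercept column (held fixed, in the spirit of (A.5))
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta = np.array([1.0, 2.0, -0.5])   # assumed true coefficients
sigma = 0.8
y = X @ beta + rng.normal(scale=sigma, size=n)   # ε ~ N(0, σ²I)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y          # β̂ = (X′X)⁻¹X′y
cov_beta_hat = sigma**2 * XtX_inv     # Var(β̂) = σ²(X′X)⁻¹

# The closed form agrees with the library least-squares solution
beta_lstsq = np.linalg.lstsq(X, y, rcond=None)[0]
assert np.allclose(beta_hat, beta_lstsq)
```

With n = 200 observations, β̂ lands close to the assumed β; the diagonal of `cov_beta_hat` gives the variances used for the usual normal-theory inference.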
consistent, and asymptotically efficient.

In this section we will demonstrate that the statistical properties of β̂ are crucially dependent upon the validity of assumptions (A.1)-(A.5). The discussion will proceed by dropping one assumption at a time and considering the consequences. First, we will drop (A.1) and then (A.2). This will be followed by the generalized regression model, which includes heteroskedasticity (violation of (A.3)), autocorrelation (violation of (A.4)), and the classical normal linear regression model as special cases. In Sections G, H, and I we will consider the implications of violating (A.5), the existence of measurement error, and the presence of specification error (guessing the wrong model).

To summarize the consequences of violating the various assumptions: dropping (A.1), (A.3), and/or (A.4) leaves OLS unbiased and consistent, but no longer efficient (it does not have the smallest possible variance). However, if (A.5) is violated by the errors being correlated with the X's (perhaps because endogenous regressors are included), OLS will be biased and inconsistent.

B. The Random Disturbances are not distributed normally, but (A.2)-(A.5) are valid.

An inspection of the derivation of the least squares estimator β̂ reveals that the derivation is independent of assumptions (A.1)-(A.5); hence, β̂ = (X′X)⁻¹X′y is still the correct formula for the least squares estimator of β in the model y = Xβ + ε regardless of the assumptions about the distribution of ε. However, it should be mentioned that the statistical properties of β̂ are very sensitive to the assumptions about the distribution of ε.
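The summary claim above can be checked with a small Monte Carlo sketch (an illustration added here; the sample size, replication count, and the uniform error distribution are assumptions). With non-normal, mean-zero errors β̂ stays centered on β, while errors correlated with the regressor shift it away:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 100, 2000
beta = np.array([1.0, 2.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # held fixed across replications

def ols(X, y):
    return np.linalg.solve(X.T @ X, X.T @ y)

# Case 1: uniform (non-normal) errors with E(ε) = 0 — violates (A.1) only
est1 = np.array([ols(X, X @ beta + rng.uniform(-1, 1, n)) for _ in range(reps)])

# Case 2: errors correlated with the regressor — violates (A.5)
est2 = []
for _ in range(reps):
    eps = 0.9 * X[:, 1] + rng.normal(size=n)   # Cov(ε_t, x_t) ≠ 0
    est2.append(ols(X, X @ beta + eps))
est2 = np.array(est2)

print(est1.mean(axis=0))   # ≈ [1.0, 2.0]: OLS still unbiased
print(est2.mean(axis=0))   # slope ≈ 2.9, not 2.0: OLS biased
```

In case 2 the fitted slope converges to 2.9 rather than 2.0 because the correlated error component 0.9·x_t is absorbed into the slope estimate, which is exactly the bias the notes attribute to violating (A.5).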
Similarly, we note that the BLUE of β is invariant with respect to the assumptions about the underlying probability density function of ε as long as (A.2)-(A.5) are valid. In this case we can conclude that β̂ = β̃ = (X′X)⁻¹X′y and both β̂ and β̃ will be unbiased
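The unbiasedness claim rests on the decomposition β̂ = β + (X′X)⁻¹X′ε, which follows by substituting y = Xβ + ε into the OLS formula and uses only (A.2) and (A.5), not normality. A quick numerical check of that identity (simulated data; all names and values are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta = np.array([0.5, -1.5])
eps = rng.laplace(size=n)          # deliberately non-normal errors
y = X @ beta + eps

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
# Substituting y = Xβ + ε gives β̂ = β + (X′X)⁻¹X′ε exactly:
assert np.allclose(beta_hat, beta + XtX_inv @ X.T @ eps)
```

Since the identity holds for every draw of ε, taking expectations with E(ε) = 0 and X fixed gives E(β̂) = β, whatever the shape of ε's distribution.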
