Professor Mumford
Econ 360 - Spring 2010
[email protected]

Problem Set 4 Answers

True/False

1. FALSE. Heteroskedasticity does not cause the OLS estimator to be biased. To show that the OLS estimator is unbiased, we need only assumptions MLR.1-MLR.4; assumption MLR.5 (homoskedasticity) is not needed. Because this assumption plays no role in showing that the OLS estimator is unbiased, the OLS estimator remains unbiased even if the error term is heteroskedastic.

2. TRUE. A violation of the zero conditional mean assumption would cause the OLS estimator to be biased. Assumption MLR.4 is required to show that OLS is unbiased.

3. TRUE. Omitting an important variable that is correlated with the regressor of interest would cause the OLS estimator to be biased. Omitting an important variable that is correlated with the included explanatory variables violates assumption MLR.4, which is necessary to show that OLS is unbiased.

4. FALSE. Including an irrelevant variable that is correlated with the regressor of interest would not cause the OLS estimator to be biased; it does, however, increase the variance. Including an irrelevant variable does not violate any of assumptions MLR.1-MLR.4, so we can still show that OLS is unbiased.

5. FALSE. A sample correlation coefficient of 0.95 between the regressor of interest and another regressor in the model is called multicollinearity and would not cause the OLS estimator to be biased. The degree of collinearity between the explanatory variables in the sample, even when it is reflected in a correlation as high as 0.95, does not violate any of assumptions MLR.1-MLR.4 and thus does not introduce bias. Assumption MLR.3 is violated only if there is a perfect linear relationship among two or more explanatory variables.

Long Answer Questions

6. Gauss-Markov Assumptions

(a) MLR.1 (Linear in Parameters). The model in the population can be written as

    y = β0 + β1x1 + ... + βkxk + u.

Violation: the population model is not linear in the parameters. For example, the population model y = β0 + x1^(β1) + u is not linear in the parameters.

(b) MLR.2 (Random Sampling). We have a random sample of n observations {(xi1, xi2, ..., xik, yi) : i = 1, 2, ..., n} from the population. Violation: we have a nonrandom sample from the population. For example, a racially balanced sample of individuals (one quarter black non-Hispanic, one quarter white non-Hispanic, one quarter Hispanic, and one quarter Asian) is not a random sample of the US population.

(c) MLR.3 (No Perfect Collinearity). In the sample, none of the independent variables is constant and there are no exact linear relationships among the independent variables.
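
A short simulation can illustrate answers 1 and 3. The Python sketch below is not part of the original answer key; the sample sizes, coefficients, and error structure are illustrative assumptions. It repeatedly draws data and averages the OLS slope on x1: with a heteroskedastic but mean-zero error the average stays near the true slope, while omitting a regressor that is correlated with x1 shifts it away.

import numpy as np

# Monte Carlo sketch (illustrative, not from the answer key): compare the OLS
# slope on x1 under (i) heteroskedastic errors and (ii) an omitted regressor
# x2 that is correlated with x1.
rng = np.random.default_rng(0)
n, reps = 500, 2000
beta0, beta1, beta2 = 1.0, 2.0, 3.0

het_slopes, omitted_slopes = [], []
for _ in range(reps):
    x1 = rng.normal(size=n)
    x2 = 0.5 * x1 + rng.normal(size=n)        # x2 correlated with x1
    X = np.column_stack([np.ones(n), x1])     # regress on an intercept and x1 only

    # (i) Var(u|x1) depends on x1, but E(u|x1) = 0, so MLR.4 still holds
    u_het = np.sqrt(np.exp(x1)) * rng.normal(size=n)
    y_het = beta0 + beta1 * x1 + u_het
    het_slopes.append(np.linalg.lstsq(X, y_het, rcond=None)[0][1])

    # (ii) true model includes x2, but the regression omits it, violating MLR.4
    y_full = beta0 + beta1 * x1 + beta2 * x2 + rng.normal(size=n)
    omitted_slopes.append(np.linalg.lstsq(X, y_full, rcond=None)[0][1])

print("true slope on x1:            ", beta1)
print("mean slope, heteroskedastic: ", np.mean(het_slopes))      # near 2.0 (unbiased)
print("mean slope, omitted x2:      ", np.mean(omitted_slopes))  # near 2.0 + 3.0*0.5 = 3.5 (biased)

The second average reflects the usual omitted variable bias: the coefficient on the omitted regressor times the slope from regressing x2 on x1 (here 3.0 * 0.5 = 1.5).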