
# Lecture 10 Prof. Arkonac's Slides (Ch 6.7 - 7.3) for ECO 4000

## Multiple Regression (cont'd): Multicollinearity, Hypothesis Testing in Multiple Regression

ECO 4000, Statistical Analysis for Economics and Finance, Fall 2010
Lecture 10, Prof. Seyhan Arkonac, PhD

The last thing we covered last time was multicollinearity:

(a) **Perfect multicollinearity**: for example, including both male and female dummies in a regression; Stata drops one of them.

(b) **Imperfect multicollinearity** (often simply called "multicollinearity"): occurs when two or more regressors are highly correlated.

## Multicollinearity, Perfect and Imperfect (SW Section 6.7)

Some more examples of perfect multicollinearity:

- The example from earlier: you include STR twice.
- Second example: regress TestScore on a constant, D, and B, where D_i = 1 if STR ≤ 20 and 0 otherwise, and B_i = 1 if STR > 20 and 0 otherwise. Then B_i = 1 − D_i, and there is perfect multicollinearity.

Would there be perfect multicollinearity if the intercept (constant) were somehow dropped (that is, omitted or suppressed) in this regression? This example is a special case of...

## The dummy variable trap

Suppose you have a set of multiple binary (dummy) variables that are mutually exclusive and exhaustive; that is, there are multiple categories and every observation falls in one and only one category (Freshman, Sophomore, Junior, Senior, Other). If you include all of these dummy variables and a constant, you will have perfect multicollinearity. This is sometimes called the **dummy variable trap**.

Why is there perfect multicollinearity here?

Solutions to the dummy variable trap:

1. Omit one of the groups (e.g., Senior), or
2. Omit the intercept.

What are the implications of (1) or (2) for the interpretation of the coefficients?

## Perfect multicollinearity in Stata
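The dummy variable trap can be checked numerically: with a constant plus an exhaustive set of dummies (here, female + male = 1 for every observation), the columns of the design matrix X are linearly dependent, so X'X is singular and OLS has no unique solution. A minimal pure-Python sketch with synthetic data (not the course dataset):

```python
def det3(m):
    """Determinant of a 3x3 matrix via cofactor expansion."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Columns: [constant, female, male]; note female + male = constant in every row.
X = [
    [1, 1, 0],
    [1, 0, 1],
    [1, 1, 0],
    [1, 0, 1],
]

# Gram matrix X'X (3x3)
XtX = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(3)]
       for i in range(3)]

print(det3(XtX))  # 0 -> X'X is not invertible: perfect multicollinearity
```

Since X'X cannot be inverted, the normal equations have infinitely many solutions; this is exactly why Stata silently drops one of the collinear regressors.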
```
. reg ahe age female male

      Source |       SS       df       MS              Number of obs =    7986
-------------+------------------------------           F(  2,  7983) =  164.89
       Model |  24300.9748     2  12150.4874           Prob > F      =  0.0000
    Residual |  588266.294  7983  73.6898777           R-squared     =  0.0397
-------------+------------------------------           Adj R-squared =  0.0394
       Total |  612567.269  7985  76.7147487           Root MSE      =  8.5843

------------------------------------------------------------------------------
         ahe |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         age |   .4415421   .0332389    13.28   0.000     .3763852    .5066989
      female |  -2.346755   .1950323   -12.03   0.000    -2.729069   -1.964441
        male |  (dropped)
       _cons |   4.606864   .9990293     4.61   0.000     2.648505    6.565222
------------------------------------------------------------------------------
```

Perfect multicollinearity with no intercept:

```
. reg ahe age female male, noconstant

      Source |       SS       df       MS              Number of obs =    7986
-------------+------------------------------           F(  3,  7983) =10270.68
       Model |  2270534.96     3  756844.986           Prob > F      =  0.0000
    Residual |  588266.294  7983  73.6898777           R-squared     =  0.7942
-------------+------------------------------           Adj R-squared =  0.7941
       Total |  2858801.25  7986  357.976616           Root MSE      =  8.5843
...
```
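Note that in the no-constant regression the R-squared jumps from 0.0397 to 0.7942 even though the residual sum of squares (588266.294) is unchanged. This is because, when the intercept is omitted, Stata reports the *uncentered* R-squared, which uses the sum of squared y values rather than squared deviations from the mean as the total sum of squares, so the two R-squared values are not comparable. A small pure-Python sketch with made-up numbers (not the lecture's data) illustrates the difference:

```python
# Synthetic outcomes and fitted values from some hypothetical regression.
y     = [10.0, 12.0, 9.0, 14.0, 11.0]
y_hat = [10.5, 11.5, 9.5, 13.0, 11.5]

ssr = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))   # residual SS (same either way)
ybar = sum(y) / len(y)
tss_centered   = sum((yi - ybar) ** 2 for yi in y)      # TSS with a constant
tss_uncentered = sum(yi ** 2 for yi in y)               # TSS without a constant

r2_centered   = 1 - ssr / tss_centered
r2_uncentered = 1 - ssr / tss_uncentered
print(r2_centered, r2_uncentered)  # the uncentered R-squared is much larger
```

The uncentered total sum of squares is far larger than the centered one, so the same residuals produce a much higher "R-squared" when the constant is dropped.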

## This note was uploaded on 05/05/2011 for the course ECON 4000 taught by Professor Arkonac during the Spring '11 term at CUNY Baruch.
