Notes 2 - The Multiple Regression Model

The Multiple Regression Model

Now that we know the basics of regression, we can expand our knowledge to construct more realistic models. For example, work experience may be a good predictor of earnings, but it is certainly not the only predictor. Hence, we want to expand our model to include multiple explanatory variables. This is precisely what multivariate regression does: it expands our model from a single explanatory variable to several. The model specification is

$$y_i = \beta_0 + \beta_1 x_{1i} + \beta_2 x_{2i} + \dots + \beta_k x_{ki} + \varepsilon_i .$$

Note that the model is linear in the coefficients: each $\beta_j$ does not change when $x_{ji}$ changes. Like our simple linear regression, the multivariate model has a set of assumptions that, when met, yield the best model estimates.

Assumptions of the Multivariate Classical Linear Regression Model

A1. The error terms, $\varepsilon_i$, are normally distributed.
A2. The error has an expected (average) value of zero, i.e. $E(\varepsilon_i) = 0$.
A3. The error terms have a common variance, $\operatorname{var}(\varepsilon_i) = \sigma^2$ for all error terms.
A4. Each error term is independent of all other error terms, $\operatorname{cov}(\varepsilon_i, \varepsilon_j) = 0$ for $i \neq j$.
A5. Regressors and error terms are independent of each other, $\operatorname{cov}(x_{ji}, \varepsilon_i) = 0$.
A6. There is no linear relationship between the explanatory variables, $\operatorname{cov}(x_{ji}, x_{mi}) = 0$ for $j \neq m$.

These assumptions are the same as those underlying the simple CLRM, except for A6, which additionally requires that the explanatory variables be independent of one another.

To derive the multivariate CLRM estimates, we can use scalar algebra. Consider the two-regressor model

$$y_i = \beta_0 + \beta_1 x_{1i} + \beta_2 x_{2i} + \varepsilon_i ,$$

and work with the variables measured as deviations from their means, so that the intercept drops out. We now seek to minimize the sum of squared errors,

$$SSE = \sum_i \left( y_i - \beta_1 x_{1i} - \beta_2 x_{2i} \right)^2 .$$

Setting the partial derivatives with respect to $\beta_1$ and $\beta_2$ equal to zero gives the two normal equations:

$$\frac{\partial SSE}{\partial \beta_1} = -2 \sum_i x_{1i} \left( y_i - \beta_1 x_{1i} - \beta_2 x_{2i} \right) = 0 \;\Rightarrow\; (A)\ \ \sum_i x_{1i} y_i = \beta_1 \sum_i x_{1i}^2 + \beta_2 \sum_i x_{1i} x_{2i}$$

$$\frac{\partial SSE}{\partial \beta_2} = -2 \sum_i x_{2i} \left( y_i - \beta_1 x_{1i} - \beta_2 x_{2i} \right) = 0 \;\Rightarrow\; (B)\ \ \sum_i x_{2i} y_i = \beta_1 \sum_i x_{1i} x_{2i} + \beta_2 \sum_i x_{2i}^2$$

Multiply (A) by $\sum_i x_{2i}^2$ and (B) by $\sum_i x_{1i} x_{2i}$, then subtract (B) from (A). The $\beta_2$ terms cancel, leaving

$$\hat{\beta}_1 = \frac{\sum_i x_{1i} y_i \sum_i x_{2i}^2 - \sum_i x_{2i} y_i \sum_i x_{1i} x_{2i}}{\sum_i x_{1i}^2 \sum_i x_{2i}^2 - \left( \sum_i x_{1i} x_{2i} \right)^2} .$$

Similarly,

$$\hat{\beta}_2 = \frac{\sum_i x_{2i} y_i \sum_i x_{1i}^2 - \sum_i x_{1i} y_i \sum_i x_{1i} x_{2i}}{\sum_i x_{1i}^2 \sum_i x_{2i}^2 - \left( \sum_i x_{1i} x_{2i} \right)^2} .$$

We can then reverse back to find the intercept, $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}_1 - \hat{\beta}_2 \bar{x}_2$. However, this is a rather cumbersome process, and it is unnecessary. Instead, we opt for linear algebra, where multiple variables are handled in a simple matrix form. You will see that this matrix representation is easier to manage, and that the simple CLRM is just a special case. Let us make some necessary definitions. For $n$ observations and $k$ regressors, let

$$y = \begin{pmatrix} y_1 \\ \vdots \\ y_n \end{pmatrix}, \quad X = \begin{pmatrix} 1 & x_{11} & \cdots & x_{k1} \\ \vdots & \vdots & & \vdots \\ 1 & x_{1n} & \cdots & x_{kn} \end{pmatrix}, \quad \beta = \begin{pmatrix} \beta_0 \\ \vdots \\ \beta_k \end{pmatrix}, \quad \varepsilon = \begin{pmatrix} \varepsilon_1 \\ \vdots \\ \varepsilon_n \end{pmatrix} .$$

We can now define our model as $y = X\beta + \varepsilon$. We use this model to minimize the sum of squared errors to obtain the parameter estimates.
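To make the scalar derivation concrete, here is a minimal sketch in Python with NumPy (not part of the original notes; the data are simulated purely for illustration). It computes the deviation-form estimates of $\hat{\beta}_1$ and $\hat{\beta}_2$ from the formulas above and then reverses back for the intercept.

```python
import numpy as np

# Simulated data for illustration: true coefficients are 1.0, 2.0, -0.5.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(scale=0.3, size=n)

# Work in deviations from the means so the intercept drops out.
y_d, x1_d, x2_d = y - y.mean(), x1 - x1.mean(), x2 - x2.mean()

s11 = np.sum(x1_d ** 2)       # sum of x1^2
s22 = np.sum(x2_d ** 2)       # sum of x2^2
s12 = np.sum(x1_d * x2_d)     # sum of x1 * x2
s1y = np.sum(x1_d * y_d)      # sum of x1 * y
s2y = np.sum(x2_d * y_d)      # sum of x2 * y

# The closed-form solutions of the two normal equations.
denom = s11 * s22 - s12 ** 2
b1 = (s1y * s22 - s2y * s12) / denom
b2 = (s2y * s11 - s1y * s12) / denom

# "Reverse back" to recover the intercept from the sample means.
b0 = y.mean() - b1 * x1.mean() - b2 * x2.mean()

print(b0, b1, b2)  # approximately 1.0, 2.0, -0.5
```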
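The matrix form collapses all of this bookkeeping into a single expression. Minimizing the sum of squared errors in matrix form yields the standard OLS estimator $\hat{\beta} = (X'X)^{-1} X'y$; the sketch below (again with simulated data, not from the original notes) verifies that it reproduces the scalar results above.

```python
import numpy as np

# Same simulated data as in the previous sketch.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(scale=0.3, size=n)

# Stack the regressors with a leading column of ones for the intercept.
X = np.column_stack([np.ones(n), x1, x2])

# OLS estimator beta_hat = (X'X)^{-1} X'y, computed by solving the
# normal equations directly rather than forming an explicit inverse.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

print(beta_hat)  # approximately [1.0, 2.0, -0.5], matching the scalar formulas
```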