# Notes 2 - The Multiple Regression Model


Now that we know the basics of regression, we can expand our knowledge to construct more realistic models. For example, work experience may be a good predictor of earnings, but it certainly is not the only predictor. Hence, we want to expand our model to include multiple explanatory variables. This is precisely what multivariate regression does: it extends the model from a single explanatory variable to several. The model specification is

$$y_i = \alpha + \beta_1 x_{1i} + \beta_2 x_{2i} + \dots + \beta_n x_{ni} + \varepsilon_i$$

Note that the model is linear in the coefficients: $\beta_i$ does not change when $x_i$ changes. Like our simple linear regression, the multivariate model has a set of assumptions that, when met, yield the "best" model estimates.

## Assumptions of the Multivariate Classical Linear Regression Model

A1. The error terms, $\varepsilon$, are normally distributed.
A2. The error has an expected (average) value of zero, i.e. $E(\varepsilon) = 0$.
A3. The error terms have a common variance, $\mathrm{var}(\varepsilon) = \sigma^2$ for all error terms.
A4. Each error term is independent of all other error terms, $\mathrm{cov}(\varepsilon_i, \varepsilon_j) = 0$.
A5. Regressors and error terms are independent of each other, $\mathrm{cov}(x_i, \varepsilon_i) = 0$.
A6. There is no linear relationship between the explanatory variables, $\mathrm{cov}(x_i, x_j) = 0$.

These assumptions are the same as those underlying the simple CLRM, except for A6, which adds that the explanatory variables are independent of one another.

To derive the multivariate CLRM estimators using scalar algebra, consider the two-regressor case. Define the model as

$$y_i = \alpha + \beta_1 x_{1i} + \beta_2 x_{2i} + \varepsilon_i$$

We now seek to minimize the sum of squared errors.
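Before deriving the estimators by hand, here is a minimal numerical sketch of the two-regressor model. The coefficient values ($\alpha = 1$, $\beta_1 = 2$, $\beta_2 = 3$) and the data points are invented purely for illustration; with no error term, least squares recovers the coefficients exactly.

```python
import numpy as np

# Hypothetical regressors (made-up values for illustration only).
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([5.0, 3.0, 8.0, 1.0, 7.0, 2.0])

# y = alpha + b1*x1 + b2*x2 with alpha=1, b1=2, b2=3 and no error term,
# so OLS should recover the coefficients exactly.
y = 1.0 + 2.0 * x1 + 3.0 * x2

# Design matrix: a leading column of ones carries the intercept.
X = np.column_stack([np.ones_like(x1), x1, x2])

# Least-squares fit of y on X.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

assert np.allclose(beta_hat, [1.0, 2.0, 3.0])
```

This is the same minimization we carry out analytically below, just delegated to a numerical solver.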
Working in deviation form (each variable measured as a deviation from its mean, so the intercept drops out), the objective is

$$SSE = \sum_i \left( y_i - \beta_1 x_{1i} - \beta_2 x_{2i} \right)^2$$

Setting each partial derivative to zero yields the two normal equations:

$$\frac{\partial SSE}{\partial \beta_1} = -2 \sum x_{1i} \left( y_i - \beta_1 x_{1i} - \beta_2 x_{2i} \right) = 0 \;\Rightarrow\; \sum x_{1i} y_i = \beta_1 \sum x_{1i}^2 + \beta_2 \sum x_{1i} x_{2i} \qquad (A)$$

$$\frac{\partial SSE}{\partial \beta_2} = -2 \sum x_{2i} \left( y_i - \beta_1 x_{1i} - \beta_2 x_{2i} \right) = 0 \;\Rightarrow\; \sum x_{2i} y_i = \beta_1 \sum x_{1i} x_{2i} + \beta_2 \sum x_{2i}^2 \qquad (B)$$

Multiply (A) by $\sum x_{2i}^2$ and (B) by $\sum x_{1i} x_{2i}$. Now, subtract B′ from A′, which eliminates $\beta_2$:

$$\hat{\beta}_1 = \frac{\sum x_{1i} y_i \sum x_{2i}^2 - \sum x_{2i} y_i \sum x_{1i} x_{2i}}{\sum x_{1i}^2 \sum x_{2i}^2 - \left( \sum x_{1i} x_{2i} \right)^2}$$

Similarly,

$$\hat{\beta}_2 = \frac{\sum x_{2i} y_i \sum x_{1i}^2 - \sum x_{1i} y_i \sum x_{1i} x_{2i}}{\sum x_{1i}^2 \sum x_{2i}^2 - \left( \sum x_{1i} x_{2i} \right)^2}$$

We could now reverse the deviation transformation to recover the intercept; however, this is a rather cumbersome process and unnecessary. Instead, we opt for linear algebra, where multiple variables are handled in a simple matrix form. You will see that this matrix representation is easier to manage, and that the simple CLRM is just a special case. Let's make some necessary definitions. Let $y$ be the $n \times 1$ vector of observations, $X$ the matrix of regressors (with a leading column of ones for the intercept), $\beta$ the vector of coefficients, and $\varepsilon$ the vector of errors. We can now define our model as

$$y = X\beta + \varepsilon$$

We use this model to minimize the sum of squared errors to obtain the parameter estimates.