Wooldridge, Introductory Econometrics, 4th ed.

Chapter 3: Multiple regression analysis: Estimation

In multiple regression analysis, we extend the simple (two-variable) regression model to consider the possibility that there are additional explanatory factors that have a systematic effect on the dependent variable. The simplest extension is the "three-variable" model, in which a second explanatory variable is added:

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + u \qquad (1)$$

where each slope coefficient is now the partial derivative of $y$ with respect to the $x$ variable that it multiplies: that is, holding $x_2$ fixed, $\beta_1 = \partial y / \partial x_1$. This extension also allows us to consider nonlinear relationships, such as a polynomial in $z$, where $x_1 = z$ and $x_2 = z^2$. The regression is then linear in $x_1$ and $x_2$, but nonlinear in $z$: $\partial y / \partial z = \beta_1 + 2\beta_2 z$.
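To make the polynomial interpretation concrete, here is a minimal sketch in Python on simulated data (the variable names and true coefficients are illustrative assumptions, not from the text): we regress $y$ on $z$ and $z^2$, then evaluate the marginal effect $\partial y / \partial z = b_1 + 2 b_2 z$ at several values of $z$.

```python
# Minimal sketch: OLS with x1 = z and x2 = z^2 on simulated data.
import numpy as np

rng = np.random.default_rng(42)
n = 200
z = rng.uniform(0, 10, n)
u = rng.normal(0, 1, n)                 # error term with E(u|z) = 0
y = 1.0 + 0.5 * z - 0.05 * z**2 + u     # true relation, quadratic in z

# Regressor matrix: constant, x1 = z, x2 = z^2
X = np.column_stack([np.ones(n), z, z**2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS estimates (b0, b1, b2)

# The marginal effect of z on y varies with z: dy/dz = b1 + 2*b2*z
for z0 in (2.0, 5.0, 8.0):
    print(f"dy/dz at z = {z0}: {b[1] + 2 * b[2] * z0:.3f}")
```

Note how the estimated marginal effect declines with $z$ here, as the negative coefficient on $z^2$ implies.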
The key assumption for this model, analogous to that which we specified for the simple regression model, involves the independence of the error process $u$ and both regressors, or explanatory variables:

$$E(u \mid x_1, x_2) = 0. \qquad (2)$$

This assumption of a zero conditional mean for the error process implies that the error does not systematically vary with the $x$'s, nor with any linear combination of the $x$'s; $u$ is independent, in the statistical sense, from the distributions of the $x$'s.
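A brief simulation can illustrate why the zero conditional mean assumption matters. The sketch below (simulated data; all numbers are illustrative assumptions) fits the same model twice: once with an error drawn independently of the regressor, and once with an error that contains a component of the regressor, so that $E(u \mid x_1) \neq 0$. Only in the first case does OLS recover the true slope.

```python
# Sketch: OLS under E(u|x) = 0 versus a violation of that assumption.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x1 = rng.normal(size=n)

# Case 1: error independent of x1, so E(u|x1) = 0
y_good = 2.0 + 3.0 * x1 + rng.normal(size=n)

# Case 2: error contains a component of x1, so E(u|x1) != 0
y_bad = 2.0 + 3.0 * x1 + (0.8 * x1 + rng.normal(size=n))

X = np.column_stack([np.ones(n), x1])
for label, y in (("E(u|x) = 0 ", y_good), ("E(u|x) != 0", y_bad)):
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(f"{label}: estimated slope = {b[1]:.3f} (true slope 3.0)")
```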
The model may now be generalized to the case of $k$ regressors:

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_k x_k + u \qquad (3)$$

where the $\beta$ coefficients have the same interpretation: each is the partial derivative of $y$ with respect to that $x$, holding all other $x$'s constant (ceteris paribus), and the $u$ term is that nonsystematic part of $y$ not linearly related to any of the $x$'s. The dependent variable $y$ is taken to be linearly related to the $x$'s, which may bear any relation to each other (e.g. polynomials or other transformations) as long as there are no exact linear dependencies among the regressors. That is, no $x$ variable can be an exact linear combination of the other regressors, or the regression estimates cannot be calculated. The independence assumption now becomes:

$$E(u \mid x_1, x_2, \ldots, x_k) = 0. \qquad (4)$$
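The sketch below shows what goes wrong under an exact linear dependency (simulated data; the construction $x_2 = 2x_1$ is an arbitrary illustration): the regressor matrix loses full column rank, so the coefficients are no longer uniquely determined.

```python
# Sketch: exact collinearity makes the OLS coefficients non-unique.
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = 2.0 * x1                          # exact linear function of x1
y = 1.0 + x1 + x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
print("columns:", X.shape[1], " rank:", np.linalg.matrix_rank(X))  # rank 2 < 3

# lstsq still returns *a* minimizer, but flags the rank deficiency;
# infinitely many coefficient vectors fit the data equally well.
b, _, rank, _ = np.linalg.lstsq(X, y, rcond=None)
print("reported rank:", rank, " one solution:", np.round(b, 3))
```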
Mechanics and interpretation of OLS

Consider first the "three-variable model" given above in (1). The estimated OLS equation contains the parameters of interest:

$$\hat{y} = b_0 + b_1 x_1 + b_2 x_2 \qquad (5)$$
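In practice, equation (5) is estimated with a regression routine. Below is a minimal sketch using the statsmodels package (assumed installed; the data are simulated and the true coefficients are arbitrary illustrations):

```python
# Sketch: estimating the three-variable model (5) with statsmodels.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

X = sm.add_constant(np.column_stack([x1, x2]))  # prepend intercept column
fit = sm.OLS(y, X).fit()
print(fit.params)   # b0, b1, b2: the fitted counterpart of equation (5)
```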
We may define the ordinary least squares criterion in terms of the OLS residuals, calculated from a sample of size $n$, from this expression:

$$\min S = \sum_{i=1}^{n} (y_i - b_0 - b_1 x_{i1} - b_2 x_{i2})^2 \qquad (6)$$

where the minimization of this expression is performed with respect to each of the three parameters, $\{b_0, b_1, b_2\}$. In the case of $k$ regressors, these expressions include terms in $b_k$, and the minimization is performed with respect to the $(k+1)$ parameters $\{b_0, b_1, b_2, \ldots, b_k\}$. For this to be feasible, we must have $n > (k+1)$: that is, the sample must be larger than the number of parameters to be estimated from it.
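As a check on the mechanics, the criterion (6) can be minimized numerically and compared with the closed-form solution of the normal equations, $b = (X'X)^{-1}X'y$, which follows from setting the derivatives of $S$ with respect to each parameter to zero. The sketch below uses simulated data and assumes scipy is available.

```python
# Sketch: minimizing the SSR criterion (6) directly, then checking the
# result against the normal-equations solution b = (X'X)^{-1} X'y.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 200                                   # n > k + 1 = 3, as required
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])

def ssr(b):
    """The OLS criterion S from equation (6)."""
    resid = y - X @ b
    return resid @ resid

b_numeric = minimize(ssr, x0=np.zeros(3)).x         # numerical minimizer
b_closed = np.linalg.solve(X.T @ X, X.T @ y)        # normal equations
print(np.allclose(b_numeric, b_closed, atol=1e-4))  # True
```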