Wooldridge, Introductory Econometrics, 4th ed.

Chapter 3: Multiple regression analysis: Estimation

In multiple regression analysis, we extend the simple (two-variable) regression model to consider the possibility that there are additional explanatory factors that have a systematic effect on the dependent variable. The simplest extension is the three-variable model, in which a second explanatory variable is added:

y = β_0 + β_1 x_1 + β_2 x_2 + u    (1)

where each of the slope coefficients is now a partial derivative of y with respect to the x variable that it multiplies: that is, holding x_2 fixed, β_1 = ∂y/∂x_1. This extension also allows us to consider nonlinear relationships, such as a polynomial in z, where x_1 = z and x_2 = z^2. Then the regression is linear in x_1 and x_2, but nonlinear in z: ∂y/∂z = β_1 + 2 β_2 z.

The key assumption for this model, analogous to that which we specified for the simple regression model, involves the independence of the error process u and both regressors, or explanatory variables:

E(u | x_1, x_2) = 0.    (2)

This assumption of a zero conditional mean for the error process implies that it does not systematically vary with the x's, nor with any linear combination of the x's; u is independent, in the statistical sense, from the distributions of the x's.

The model may now be generalized to the case of k regressors:

y = β_0 + β_1 x_1 + β_2 x_2 + ... + β_k x_k + u    (3)

where the coefficients have the same interpretation: each is the partial derivative of y with respect to that x, holding all other x's constant (ceteris paribus), and the u term is the nonsystematic part of y not linearly related to any of the x's. The dependent variable y is taken to be linearly related to the x's, which may bear any relation to each other (e.g. polynomials or other transformations) as long as there are no exact linear dependencies among the regressors.
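The polynomial case above can be illustrated numerically: regressing y on x_1 = z and x_2 = z^2 is an ordinary linear regression in the x's, yet the fitted marginal effect of z varies with z. This is a minimal sketch using numpy; the true coefficients (1.0, 2.0, -0.5), the sample size, and the noise scale are illustrative assumptions, not values from the text.

```python
import numpy as np

# Simulate data where y depends on z through a quadratic.
# The coefficients and noise scale here are illustrative assumptions.
rng = np.random.default_rng(0)
z = rng.uniform(-3, 3, size=200)
u = rng.normal(scale=0.1, size=200)
y = 1.0 + 2.0 * z - 0.5 * z**2 + u

# Regress y on x1 = z and x2 = z^2: linear in the x's, nonlinear in z.
X = np.column_stack([np.ones_like(z), z, z**2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# The marginal effect dy/dz = b1 + 2*b2*z is not constant; evaluate it at z = 1.
marginal_effect_at_1 = b[1] + 2 * b[2] * 1.0
print(b, marginal_effect_at_1)
```

The estimated slopes recover the simulated coefficients closely, and the marginal effect at z = 1 is roughly β_1 + 2 β_2 (1) = 1.0, matching the derivative formula in the text.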
That is, no x variable can be an exact linear transformation of another, or the regression estimates cannot be calculated. The independence assumption now becomes:

E(u | x_1, x_2, ..., x_k) = 0.    (4)

Mechanics and interpretation of OLS

Consider first the three-variable model given above in (1). The estimated OLS equation contains the parameters of interest:

ŷ = b_0 + b_1 x_1 + b_2 x_2    (5)

and we may define the ordinary least squares criterion in terms of the OLS residuals, calculated from a sample of size n, from this expression:

min S = Σ_{i=1}^{n} (y_i − b_0 − b_1 x_{i1} − b_2 x_{i2})^2    (6)

where the minimization of this expression is performed with respect to each of the three parameters, {b_0, b_1, b_2}. In the case of k regressors, these expressions include terms in b_k, and the minimization is performed with respect to the (k+1) parameters {b_0, b_1, b_2, ..., b_k}. For this to be feasible, n > (k+1): that is, we must have a sample larger than the number of parameters to be estimated from that sample.
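The minimization of S in (6) has a closed-form solution: setting the k+1 partial derivatives to zero yields the normal equations X′X b = X′y. A minimal sketch for the general k-regressor case, with the n > k+1 feasibility check from the text (the data-generating coefficients below are illustrative assumptions):

```python
import numpy as np

def ols(X, y):
    """Minimize the sum of squared residuals by solving the
    normal equations X'X b = X'y."""
    n, cols = X.shape  # cols = k + 1 when X includes a constant column
    if n <= cols:
        raise ValueError("need n > k + 1 observations")
    return np.linalg.solve(X.T @ X, X.T @ y)

# Usage with illustrative simulated data: k = 2 regressors plus a constant.
rng = np.random.default_rng(1)
x1 = rng.normal(size=50)
x2 = rng.normal(size=50)
y = 3.0 + 1.5 * x1 - 2.0 * x2 + rng.normal(scale=0.2, size=50)
X = np.column_stack([np.ones(50), x1, x2])
b = ols(X, y)
print(b)
```

Note that the solve step fails exactly when the regressors are linearly dependent, since X′X is then singular; this is the no-exact-linear-dependence condition stated above.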