SLN Multiple Regression Models Spring 2007

Multiple Regression Models
Supplementary Lecture Notes // Spring 2007

Multiple regression models are equations with multiple regressors: models in which the mean value of the dependent variable Y is a function of more than one independent X variable. Consider a model equation with 3 independent variables, X1, X2, X3:

E(Y | X1 = x1, X2 = x2, X3 = x3) = β0 + β1X1 + β2X2 + β3X3

This relationship can also be expressed in terms of any random observation of Y:

Yi = β0 + β1X1i + β2X2i + β3X3i + ui

where
Y is the dependent variable;
X1, X2, X3 are the independent or explanatory variables (aka regressor variables);
u is the random error term (as usual, it represents a measure of all other factors or random forces that may influence the dependent variable Y). We assume u ~ N(0, σ²);
β0 is the intercept term = the mean value of Y when X1 = 0, X2 = 0, X3 = 0.

All the other betas are called Partial Slope Coefficients:
β1 = the expected change in Y per unit change in X1, holding X2 and X3 constant.
β2 = the expected change in Y per unit change in X2, holding X1 and X3 constant.
β3 = the expected change in Y per unit change in X3, holding X1 and X2 constant.

In general, βj measures the rate of change in E(Y) per one-unit change in Xj, holding all other X variables in the model constant, i.e., all else equal or ceteris paribus. In other words, βj is a measure of the partial or marginal effect of Xj on the mean value of Y:

βj = ∆E(Y)/∆Xj = ∂E(Y)/∂Xj

Residuals: As usual, the ith residual is ûi = Yi − Ŷi, where Ŷi = β̂0 + β̂1X1i + β̂2X2i + β̂3X3i and β̂j is the Least Squares estimate/estimator of βj.

The model is estimated by obtaining estimates for the beta parameters using the method of Least Squares (LS). Recall that this method chooses the betas that minimize SSR for a given sample of observations.
Solving the minimization problem (which involves differential calculus) yields formulas (estimators) for each beta. The formulas change as more X variables are added to the model. You do not need to know the LS formulas of the beta coefficients for estimating multiple regression models; multiple regression models are usually estimated with the aid of a computer and statistical software.

Important Note: The formulas for the intercept β̂0 and slope parameter β̂1 in multiple regression models are not the same as the formulas you have for estimating simple regression models.
Illustrative Examples

EXAMPLE I: Average Starting Pay for MBA Graduates

Model 1: E(ASP) = β0 + β1GMAT

where
ASP = average starting pay ($)
GMAT = test score from the GMAT (points)

β1 = ∂E(ASP)/∂GMAT = the expected change in ASP per unit change in GMAT score.

Results from estimating Model 1 using EViews software:

Dependent Variable: ASP
Method: Least Squares
Date: 04/01/04
Sample: 1 30
Included observations: 30

Variable    Coefficient    Std. Error    t-Statistic    Prob.
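The EViews coefficient rows are cut off in this preview, but a simple regression like Model 1 can be estimated directly from the familiar simple-regression LS formulas. The GMAT scores and pay figures below are hypothetical, not the course's actual sample of 30 graduates:

```python
# Simple-regression LS estimates for ASP = b0 + b1*GMAT + u,
# on made-up data (the true slope of 120 $/point is hypothetical).
import numpy as np

rng = np.random.default_rng(1)
n = 30
gmat = rng.uniform(500, 750, n)                     # hypothetical GMAT scores
asp = 20000 + 120 * gmat + rng.normal(0, 5000, n)   # hypothetical ASP ($)

# Simple-regression formulas: slope = sample cov(X, Y) / var(X),
# intercept = mean(Y) - slope * mean(X).
b1 = np.cov(gmat, asp, ddof=1)[0, 1] / np.var(gmat, ddof=1)
b0 = asp.mean() - b1 * gmat.mean()

print(b0, b1)   # b1 estimates the expected change in ASP per GMAT point
```

As the Important Note above warns, these cov/var formulas apply only to the one-regressor case; with multiple regressors the estimators change.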

This note was uploaded on 04/13/2008 for the course ECON 322 taught by Professor Francisco during the Spring '07 term at Rutgers.
