LECTURE 20: Multiple Regression

This lecture covers multiple regression analysis. The least squares method is used to derive the estimated multiple regression equation, and the multiple coefficient of determination (R²) is used to measure goodness of fit. Testing for significance is addressed with the F test for overall significance and the t test for individual significance. Microsoft Excel is used for all applications.

Read: Chapter 13, Sections 13.1 to 13.5.

We have learned how to compute the estimated simple linear regression equation from sample data using the least squares method. The variable being predicted is the dependent variable; the variable used to predict its value is the independent variable. In multiple regression models we still have one dependent variable, but two or more independent variables. The concepts and equations that apply to simple linear regression extend naturally to multiple regression.

Multiple Regression Model

If y is the dependent variable and the independent variables are x1, x2, x3, ..., xp, we can use multiple regression to develop an equation (i.e., a mathematical model) showing how they are related. The regression model has the form:

y = B0 + B1x1 + B2x2 + ... + Bpxp + e

The model indicates that y is a linear function of x1, x2, ..., xp plus an error term e. B0, B1, B2, ..., Bp are population parameters to be estimated from the sample data. The error term e is a random variable that accounts for any variability in y that cannot be explained by the linear part of the model.
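Although the lecture uses Excel for all applications, the same estimates can be sketched in a few lines of Python. The data below are hypothetical, made up purely to illustrate fitting y = B0 + B1x1 + B2x2 + e by least squares:

```python
import numpy as np

# Hypothetical sample data: one dependent variable y,
# two independent variables x1 and x2.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y  = np.array([4.1, 5.9, 9.2, 10.1, 13.0])

# Design matrix: a leading column of ones gives the intercept b0.
X = np.column_stack([np.ones_like(x1), x1, x2])

# Least squares estimates b0, b1, b2 of the parameters B0, B1, B2.
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(b)  # array [b0, b1, b2]
```

Excel's LINEST function (or the Regression tool in the Analysis ToolPak) produces the same coefficient estimates for this layout of data.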
Estimated Multiple Regression Equation

We use sample data to estimate the population parameters. The estimated multiple regression equation is:

ŷ = b0 + b1x1 + b2x2 + ... + bpxp

With two or more independent variables, this equation describes a plane (more generally, a hyperplane) rather than a straight line, where:

ŷ = estimated value of the dependent variable y
b0, b1, b2, ..., bp = sample statistics that are point estimates of the parameters B0, B1, B2, ..., Bp
b0 = value of ŷ when all independent variables are zero
b1 = change in ŷ for a unit change in x1, when all other independent variables are held constant
b2 = change in ŷ for a unit change in x2, when all other independent variables are held constant

In general, bp = change in ŷ for a unit change in xp, when all other independent variables are held constant.

We use the least squares method to determine the estimated regression equation. Once the equation is determined, we can estimate ŷ for different values of the independent variables.
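The "held constant" interpretation of a coefficient can be checked directly. Using hypothetical estimated coefficients (not from any data set in this lecture), increasing x1 by one unit while holding x2 fixed changes ŷ by exactly b1:

```python
# Hypothetical estimated equation: y-hat = 3.0 + 2.5*x1 + 1.2*x2
b0, b1, b2 = 3.0, 2.5, 1.2

def y_hat(x1, x2):
    """Estimated value of the dependent variable."""
    return b0 + b1 * x1 + b2 * x2

base   = y_hat(10, 4)   # prediction at x1 = 10, x2 = 4
bumped = y_hat(11, 4)   # x1 raised by one unit, x2 held constant
print(bumped - base)    # approximately b1 = 2.5
```

The same reasoning applies to every slope coefficient: each bj measures the marginal effect of xj with the other independent variables held constant.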
Least Squares Method

This technique is again used to find the estimated multiple regression equation from sample data. We need the equation that best fits the data: one that minimizes the sum of the squared differences between the observed values yi and the estimated values ŷi, that is, one that minimizes SSE = Σ(yi − ŷi)².
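To make the criterion concrete, the following sketch computes SSE for a set of hypothetical observed and estimated values; least squares chooses the coefficients b0, b1, ..., bp that make this quantity as small as possible:

```python
# Hypothetical observed values and estimated values from a fitted equation.
y_obs = [4.0, 6.0, 9.0, 10.0, 13.0]
y_est = [4.2, 5.8, 9.1, 10.4, 12.9]

# SSE: sum of squared differences between observed and estimated values.
sse = sum((yo - ye) ** 2 for yo, ye in zip(y_obs, y_est))
print(round(sse, 3))  # 0.26
```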

This note was uploaded on 09/21/2011 for the course OM 210 taught by Professor Singer during the Summer '08 term at George Mason.
