DSC 203 Lecture Notes - Chapter 14


Chapter 14: Multiple Regression

We employ more than one independent (predictor) variable in the prediction equation.

Section 14.1: The Multiple Regression Model and the Least Squares Estimates

Example: Tasty Sub Shop Case

y = yearly revenue (thousands of dollars)
x1 = population size (thousands of residents)
x2 = business rating (a measure of business activity in the area), measured on a scale from 1 to 10:
    1: limited business and shopping activity nearby
    10: lots of business and shopping activity nearby

Goal: Predict y using both x1 and x2.

See Table 14.1 (on page 581) and Figures 14.1 and 14.2 (on page 582).

Taken one at a time, each independent (predictor) variable has a linear relationship with the dependent variable:

y = β0 + β1 x1 + ε, where β1 > 0
y = β0 + β1 x2 + ε, where β1 > 0

Combined Model:

y = β0 + β1 x1 + β2 x2 + ε

Here ε is the error term (the effects of all other factors on y), and β0, β1, and β2 are regression parameters relating y to x1 and x2. See Figure 14.3 on page 583 for a geometrical interpretation.

We will now use MINITAB to compute the least squares estimates of the regression parameters. See Figure 14.4(b) on page 584. The least squares estimates are b0, b1, and b2, and the least squares prediction equation is:

ŷ = b0 + b1 x1 + b2 x2 = 125.29 + 14.1996 x1 + 22.811 x2

The point prediction of y when x1 = 47.3 and x2 = 7.0 is

ŷ = 125.29 + 14.1996(47.3) + 22.811(7.0) = 956.6 (that is, $956,600)

We can understand the meaning of least squares by considering Table 14.2 on page 585: the least squares estimates minimize SSE, the sum of squared errors.

Notation. Remember:

β0, β1, β2 (betas): unknown regression parameters
b0, b1, b2 (bees): least squares estimates (aka the regression coefficients)

Interpretation of the parameters:

μy = β0 + β1 x1 + β2 x2 is the mean yearly revenue when the population is x1 and the business rating is x2.

1. β0's interpretation

Let x1 = 0 and x2 = 0. Then

μy = β0 + β1(0) + β2(0) = β0

Therefore, β0 is the mean yearly revenue when x1 = 0 and x2 = 0.
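The prediction equation can be checked with a few lines of Python. This is a minimal sketch, not part of the notes; the site values x1 = 47.3 (a population of 47,300) and x2 = 7.0 are assumed inputs chosen to reproduce the quoted prediction of 956.6:

```python
# Least squares prediction equation quoted from the MINITAB output:
#   y_hat = 125.29 + 14.1996*x1 + 22.811*x2
b0, b1, b2 = 125.29, 14.1996, 22.811

def predict(x1, x2):
    """Predicted yearly revenue in thousands of dollars."""
    return b0 + b1 * x1 + b2 * x2

# Assumed example site: 47,300 residents (x1 = 47.3), business rating 7
y_hat = predict(47.3, 7.0)
print(round(y_hat, 1))  # 956.6, i.e. about $956,600
```

Note that `predict(0, 0)` returns b0 = 125.29, which is the point-estimate analogue of the β0 interpretation above.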
This does not make any sense, and that's OK: the y-intercept often makes no sense on its own.

2. β1's interpretation

Consider a site where x1 = c and x2 = d. Then

μy = β0 + β1(c) + β2(d)

Consider a different site where x1 = (c + 1) and x2 = d. Then

μy = β0 + β1(c + 1) + β2(d)

The difference between the mean yearly revenues for the two sites is β1. Therefore, β1 = the change in mean yearly revenue (mean y) associated with a 1,000-resident increase in population size (that is, a 1-unit increase in x1) when x2 remains constant.

β2 is interpreted similarly.

The General Multiple Regression Model

y = the dependent variable
x1, x2, x3, ..., xp = p independent (predictor) variables

The general multiple regression model says that

y = β0 + β1 x1 + β2 x2 + β3 x3 + ... + βp xp + ε