# 501_Lecture_11 - Section 11.1 Multiple Linear Regression (MLR)


Section 11.1 Multiple Linear Regression (MLR)

Topics:
- MLR as an extension of SLR
- The statistical model
- Estimation of the parameters and their interpretation
- R-square with MLR
- The ANOVA table
- The F-test and t-tests
A continuation of Chapter 10. Most things are conceptually similar to SLR and extend what we learned through Chapters 2 through 10. However, most things also get much more complex, including the SAS output and learning to interpret it. Lastly, whereas before there was usually a set procedure for analyzing the data, we will now have to be more flexible and take things as they come, so to speak.

A case study: We are interested in finding variables to predict college GPA. High-school grades will be used as potential explanatory variables (also called predictors): HSM (math grades), HSS (science grades), and HSE (English grades). Since there are several explanatory variables, or x's, they need to be distinguished using subscripts: x₁ = HSM, x₂ = HSS, x₃ = HSE.
Several simple linear regressions? Why not just run a separate SLR for each predictor: regress GPA on HSM (significant?), GPA on HSS (significant?), and GPA on HSE (significant?). Why not? Because each predictor alone may not explain GPA very well, yet used together they may explain GPA quite well. Predictors can (and usually do) overlap somewhat, so we would like to identify that overlap (and remove it) if possible.
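As a minimal sketch of this point (using hypothetical, simulated grade data, not the lecture's actual data set), we can compare the R-square from each single-predictor regression with the R-square from the full MLR. The full model's R-square is always at least as large as any single predictor's:

```python
# Compare each simple regression to the full multiple regression
# on hypothetical (simulated) grade data.
import numpy as np

rng = np.random.default_rng(0)
n = 50
hsm = rng.normal(7, 1.5, n)                # hypothetical math grades
hss = 0.6 * hsm + rng.normal(3, 1, n)      # science grades, overlapping with math
hse = rng.normal(7, 1.5, n)                # hypothetical English grades
gpa = 0.8 + 0.15 * hsm + 0.10 * hss + 0.08 * hse + rng.normal(0, 0.3, n)

def r_square(y, predictors):
    """Fit least squares with an intercept; return R-square = 1 - SSE/SST."""
    A = np.column_stack([np.ones(len(y))] + list(predictors))
    b, *_ = np.linalg.lstsq(A, y, rcond=None)
    sse = ((y - A @ b) ** 2).sum()
    sst = ((y - y.mean()) ** 2).sum()
    return 1 - sse / sst

for name, x in [("HSM", hsm), ("HSS", hss), ("HSE", hse)]:
    print(name, round(r_square(gpa, [x]), 3))        # each SLR alone
print("MLR", round(r_square(gpa, [hsm, hss, hse]), 3))  # all three together
```

The variable names and simulated coefficients here are assumptions for illustration; the qualitative pattern (the combined model explaining more than any single predictor) is the point.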

Scatterplots with the MLR line: Unfortunately, because scatterplots are restricted to only two axes (a Y-axis and an X-axis), they are less useful here. We can plot Y against each predictor separately, like an SLR, but this is just a preliminary look at each of the variables and cannot tell us whether we have a good MLR or not.
The statistical model for multiple linear regression:

    yᵢ = β₀ + β₁xᵢ₁ + β₂xᵢ₂ + β₃xᵢ₃ + εᵢ

The deviations εᵢ are assumed to be independent and N(0, σ). The parameters of the model are β₀, β₁, β₂, β₃, and σ. Their estimates are b₀, b₁, b₂, b₃, and s = √MSE.
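A minimal sketch (again with simulated data, not the lecture's SAS output) of how the estimates b₀, b₁, b₂, b₃ and s = √MSE come out of least squares:

```python
# Fit the MLR model y = b0 + b1*x1 + b2*x2 + b3*x3 by least squares
# on simulated data, then estimate sigma by s = sqrt(MSE).
import numpy as np

rng = np.random.default_rng(1)
n = 40
X = rng.normal(7, 1.5, (n, 3))                      # columns: x1, x2, x3
beta = np.array([0.5, 0.2, 0.1, 0.15])              # assumed "true" parameters
y = beta[0] + X @ beta[1:] + rng.normal(0, 0.4, n)  # add N(0, sigma) deviations

A = np.column_stack([np.ones(n), X])                # design matrix with intercept
b, *_ = np.linalg.lstsq(A, y, rcond=None)           # estimates b0, b1, b2, b3
sse = ((y - A @ b) ** 2).sum()                      # sum of squared residuals
mse = sse / (n - 4)                                 # divide by n - (# parameters)
s = np.sqrt(mse)                                    # s estimates sigma
print("b =", b.round(3), " s =", round(s, 3))
```

Note the degrees of freedom: with three predictors plus an intercept, MSE uses n − 4 in the denominator, generalizing the n − 2 from SLR.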

Interpretation of the estimates: b₀ is still the intercept. b₁ is the estimated "slope" for β₁; it describes how y changes as x₁ changes. Suppose b₁ = 0.7; then if x₁ increases by 1 point, the predicted y changes by 0.7, and so on. This is essentially the same interpretation as in SLR, except that in MLR it holds with the other predictors held fixed. Then what about b₂ and b₃? Each is interpreted the same way, with the other x's held constant.
Other things that are the same: Predicted values: given values for x₁, x₂, and x₃, plug them into the regression equation to get a predicted value ŷ. Residuals: still Observed − Predicted = y − ŷ.
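For example (with made-up numbers for the fitted coefficients and the observation, purely for illustration), getting a predicted value and a residual works just as in SLR:

```python
# Predicted value and residual from a fitted MLR equation
# (coefficients and observation are hypothetical).
import numpy as np

b = np.array([0.59, 0.17, 0.03, 0.05])   # assumed fitted b0, b1, b2, b3
x_new = np.array([9.0, 8.0, 7.0])        # one student's HSM, HSS, HSE
y_hat = b[0] + b[1:] @ x_new             # plug the x's into the equation
y_obs = 3.2                              # that student's observed GPA
residual = y_obs - y_hat                 # Observed - Predicted
print(round(y_hat, 3), round(residual, 3))  # → 2.71 0.49
```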



