Lecture 11_multregII-2012

1. Lecture 11: Multiple Regression II (Stat 102, 2012)

R² as a Measure of Model Fit. Read Chapter 4.3.
Sums of Squares and Effect Tests. Read Chapter 4.4.

- Sums of Squares SST, SSE and SSR
- ANOVA Table
- R²
- F-statistic and the F-test for ALL coefficients together
- Effect Test Table and Tests for Individual Coefficients

Example: Gasoline Mileage for New Cars [cont.]
2. SST and SSE

In multiple regression, this works the same general way as in ordinary regression.

The Total Sum of Squares has the same formula:

    SST = \sum_{i=1}^{n} (Y_i - \bar{Y})^2 .

The Sum of Squares for Error has the same sort of formula:

    SSE = \sum_{i=1}^{n} (Y_i - \hat{Y}_i)^2 .

- The only difference here is that the formula for the prediction of the conditional mean involves all the x-variables: \hat{Y}_i = b_0 + b_1 x_{1i} + \cdots + b_K x_{Ki}.
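As a concrete sketch, the two sums of squares can be computed for a small least-squares fit. The data below are made up purely for illustration (they are not the course's gasoline-mileage data set), and numpy is assumed to be available:

```python
import numpy as np

# Made-up illustrative data: n = 6 observations, K = 2 x-variables
# (hypothetical values, not the course data set)
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0],
              [6.0, 4.0]])
Y = np.array([3.1, 3.9, 8.2, 8.8, 12.1, 12.9])

# Prepend an intercept column and fit b0, b1, ..., bK by least squares
Xd = np.column_stack([np.ones(len(Y)), X])
b, *_ = np.linalg.lstsq(Xd, Y, rcond=None)

Y_hat = Xd @ b                       # fitted values: b0 + b1*x_1i + ... + bK*x_Ki
SST = np.sum((Y - Y.mean()) ** 2)    # Total Sum of Squares
SSE = np.sum((Y - Y_hat) ** 2)       # Sum of Squares for Error
```

With an intercept in the model, SSE can never exceed SST, since the flat line at \bar{Y} is one of the fits least squares gets to improve on.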
3. Sum of Squares for Regression

The decomposition of the sums of squares also works just like in ordinary regression.

The Sum of Squares for Regression is called SSR in Dielman, and Sum of Squares for the Model in JMP.

The basic decomposition is SST = SSR + SSE. Hence, from SST and SSE you can find SSR as SSR = SST − SSE.

These sums of squares appear in the "Analysis of Variance" table (aka the ANOVA table).
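The decomposition SST = SSR + SSE can be checked numerically: SSR computed directly from its own definition, \sum (\hat{Y}_i - \bar{Y})^2, matches SST − SSE. This sketch reuses made-up data (numpy assumed; not the course data set):

```python
import numpy as np

# Same made-up setup as before: a least-squares fit with an intercept
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0],
              [4.0, 3.0], [5.0, 5.0], [6.0, 4.0]])
Y = np.array([3.1, 3.9, 8.2, 8.8, 12.1, 12.9])
Xd = np.column_stack([np.ones(len(Y)), X])
b, *_ = np.linalg.lstsq(Xd, Y, rcond=None)
Y_hat = Xd @ b

SST = np.sum((Y - Y.mean()) ** 2)
SSE = np.sum((Y - Y_hat) ** 2)
SSR_direct = np.sum((Y_hat - Y.mean()) ** 2)   # SSR from its own definition

# The decomposition holds (up to floating-point rounding)
assert abs(SST - (SSR_direct + SSE)) < 1e-8
```

The decomposition relies on the model containing an intercept; without one, the cross term between residuals and fitted deviations need not vanish.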
4. ANOVA Table

Here is the JMP ANOVA table for our data:

    Analysis of Variance
    Source     DF                 Sum of Squares    Mean Square       F Ratio
    Model      K = 4              SSR = 2291.2      SSR/DFR = 573     MSR/MSE = 210
    Error      n - K - 1 = 217    SSE = 589.9       SSE/DFE = 2.72    Prob > F
    C. Total   n - 1 = 221        SST = 2881.1                        <.0001

Note the Degrees of Freedom: DFR = K, DFE = n − K − 1, DFTotal = n − 1.

Note the Mean Squares:

    "Mean Square for Model" = MSR = SSR/DFR
    s_e^2 = MSE = SSE/DFE

Finally, NOTE: F = MSR/MSE.
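The mean squares and F ratio in the table follow from the sums of squares by simple arithmetic. This sketch reproduces them from the (rounded) values JMP displays:

```python
# Sums of squares and counts read off the ANOVA table above
SSR, SSE = 2291.2, 589.9
K, n = 4, 222

DFR = K              # 4
DFE = n - K - 1      # 217

MSR = SSR / DFR      # Mean Square for Model, about 573
MSE = SSE / DFE      # Mean Square for Error, about 2.72
F = MSR / MSE        # about 210.7; the table shows 210

# "Prob > F" is the upper-tail probability of an F(DFR, DFE) distribution
# evaluated at F; JMP reports it here as <.0001.
```

The small mismatch with the displayed F Ratio (210.7 vs. 210) comes from working with rounded sums of squares; JMP computes from unrounded values.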
5. The "Coefficient of Determination", R²

Relative to the terms in the ANOVA table, this also has the same definition and interpretation as in ordinary regression:

    R^2 = \frac{SSR}{SST} = \frac{SST - SSE}{SST} .

THUS, R² is the proportion of squared variability accounted for by the linear regression using all the x-variables, relative to the squared variability of the best fit that does not use any of the x-variables.

One difference between the situation here and in ordinary regression: in ordinary regression, R² is the square of the sample correlation coefficient. There's no such interpretation here!
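Plugging in the sums of squares from the ANOVA table recovers the R² that JMP reports, and shows the two forms of the definition agree:

```python
# From the ANOVA table: SSR = 2291.2, SSE = 589.9, so SST = 2881.1
SSR, SSE = 2291.2, 589.9
SST = SSR + SSE

R2 = SSR / SST                               # about 0.795
assert abs(R2 - (SST - SSE) / SST) < 1e-12   # both forms of the definition agree
```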
6. R² appears in the JMP Summary of Fit table:

    Summary of Fit
    RSquare                  (R²)       0.795
    RSquare Adj                         0.791
    Root Mean Square Error   (s_e)      1.65
    Mean of Response         (Ybar)     19.40
    Observations             (n)        222

See Dielman for the definition of "Adjusted R²". We don't find this concept particularly useful, and so will always ignore it.

Always 0 ≤ R² ≤ 1.

- R² = 0 means the linear regression equation is useless as a means of predicting the Y_i.
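The Summary of Fit quantities are all functions of the ANOVA sums of squares. This sketch reproduces s_e and R² from the table's numbers; the adjusted-R² line uses the standard formula (see Dielman), shown only for completeness since the notes choose to ignore that quantity:

```python
import math

# Numbers from the ANOVA table above
SSR, SSE = 2291.2, 589.9
K, n = 4, 222
SST = SSR + SSE

MSE = SSE / (n - K - 1)
s_e = math.sqrt(MSE)           # Root Mean Square Error, about 1.65

R2 = SSR / SST                 # RSquare, about 0.795
# Standard adjusted-R2 formula (penalizes extra x-variables; the notes ignore it)
R2_adj = 1 - (1 - R2) * (n - 1) / (n - K - 1)   # about 0.791
```

Note that s_e is just the square root of the MSE from the ANOVA table, so the two JMP reports are consistent with each other.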