STAT5044: Regression and Anova
Inyoung Kim

Outline
1 Multiple Linear Regression

Basic Idea
An extra sum of squares is the marginal reduction in the error sum of squares when one or several predictor variables are added to the regression model, given that the other predictor variables are already in the model. Equivalently, an extra sum of squares can be viewed as the marginal increase in the regression sum of squares when one or several predictor variables are added to the model.

Example
A study of the relation of amount of body fat (Y) to several possible predictor variables, based on a sample of 20 healthy females 25–34 years old. The possible predictor variables are triceps skinfold thickness (X1), thigh circumference (X2), and midarm circumference (X3). The amount of body fat for each of the 20 persons was obtained by a cumbersome and expensive procedure requiring the immersion of the person in water. It would therefore be very helpful if a regression model with some or all of these predictor variables could provide reliable estimates of the amount of body fat, since the measurements needed for the predictor variables are easy to obtain.

Example
Model     Regression                   SSE
Model 1   Reg of Y on X1               SSE(X1) = 143.12
Model 2   Reg of Y on X2               SSE(X2) = 113.42
Model 3   Reg of Y on X1 and X2        SSE(X1, X2) = 109.95
Model 4   Reg of Y on X1, X2, and X3   SSE(X1, X2, X3) = 98.41

Since SSE(X1, X2) < SSE(X1), the difference between these two SSEs is called an extra sum of squares and is denoted by SSR(X2 | X1):

SSR(X2 | X1) = SSE(X1) - SSE(X1, X2) = 143.12 - 109.95 = 33.17

This reduction in the error sum of squares is the result of adding X2 to the regression model when X1 is already included in the model. Thus, the extra sum of squares SSR(X2 | X1) measures the marginal effect of adding X2 to the regression model when X1 is already in the model.
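The computation above can be sketched in code. The snippet below uses simulated data in place of the body-fat measurements (which are not reproduced here), so the predictor names and generated values are assumptions for illustration only; the point is the mechanics: fit the reduced and full models by least squares, take each model's SSE, and subtract to get the extra sum of squares SSR(X2 | X1). It also checks the lecture's arithmetic, 143.12 - 109.95 = 33.17.

```python
import numpy as np

def sse(y, *predictors):
    """Error sum of squares from an OLS fit of y on the given
    predictors, with an intercept included."""
    X = np.column_stack([np.ones_like(y)] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

# Simulated stand-in for the body-fat data (hypothetical values,
# not the actual 20-subject dataset from the lecture).
rng = np.random.default_rng(0)
n = 20
X1 = rng.normal(25.0, 5.0, n)            # e.g., triceps skinfold thickness
X2 = rng.normal(50.0, 5.0, n)            # e.g., thigh circumference
Y = 0.5 * X1 + 0.6 * X2 + rng.normal(0.0, 2.0, n)

sse_x1 = sse(Y, X1)                      # SSE(X1): reduced model
sse_x1_x2 = sse(Y, X1, X2)               # SSE(X1, X2): fuller model

# Extra sum of squares: reduction in SSE from adding X2 given X1.
ssr_x2_given_x1 = sse_x1 - sse_x1_x2

# Adding a predictor can never increase the SSE, so this is >= 0.
assert ssr_x2_given_x1 >= 0.0

# The lecture's numbers: SSR(X2 | X1) = 143.12 - 109.95 = 33.17.
assert abs((143.12 - 109.95) - 33.17) < 1e-6
```

The same subtraction applies to any nested pair of models, e.g. SSR(X3 | X1, X2) = SSE(X1, X2) - SSE(X1, X2, X3) = 109.95 - 98.41 from the table above.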
Fall '11
Staff
Linear Regression