STA3032 Chapter 8: Multiple Regression Analysis (Chapter 11 of the text)

Chapter 11: Multiple Regression Analysis

The basic ideas are the same as in Chapter 7.

- We have one response (dependent) variable, Y. The response (Y) is a quantitative variable.
- There are p (>= 2) predictors (independent variables) in the model: X1, X2, ..., Xp.
  - The predictors can be:
    - Quantitative (as before)
    - Categorical (new)
    - Interaction terms (products of predictors)
    - Powers of predictors (e.g. a squared term such as X^2)
- In this course we will concentrate on:
  - Reading computer output
  - Interpreting coefficients
  - Determining the order in which to interpret things

Some Examples

Example 1: Suppose we want to predict temperature for different cities, based on their latitude and elevation. In this case, the response and the predictors are

  Y = temperature,  X1 = latitude,  X2 = elevation.

Possible models are:

  With p = 2:  y = b0 + b1*x1 + b2*x2 + e                 (a "stiff" surface)
  With p = 3:  y = b0 + b1*x1 + b2*x2 + b3*(x1*x2) + e    (a "twisted" surface)

(Here b0, b1, ... stand for the population coefficients beta0, beta1, ..., and e for the error term epsilon.)

Example 2: We want to predict patients' well-being from the dosage of medicine they take (mg), using a quadratic model:

  y = b0 + b1*x + b2*(x^2) + e

Here X = dosage of the active ingredient (in mg), and p = 2.

Example 3: Suppose we want to predict Y = the highway mileage of a car using X1 = its city mileage and X2 = its size (a categorical variable), where

  X2 = 2 if the car is compact, 1 if the car is larger.

The model we may use is

  y = b0 + b1*x1 + b2*x2 + b3*(x1*x2) + e

Note that the last term, b3*(x1*x2), is an interaction term, which allows for NON-parallel lines.

The Multiple Linear Regression Model:

  y = b0 + b1*x1 + b2*x2 + ... + bp*xp + e

Assumptions:
1) e ~ N(0, sigma): the error terms are iid normal with mean zero and constant standard deviation sigma.
2) As a result, Y ~ N(mu_Y, sigma) for every combination of x1, x2, ..., xp.
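The twisted-surface model of Example 1 can be fit by least squares. Below is a minimal sketch using numpy with made-up latitude/elevation data (the data, the "true" coefficients, and the noise level are all invented for illustration; they are not from the course):

```python
import numpy as np

# Hypothetical data in the spirit of Example 1: latitude and elevation.
rng = np.random.default_rng(0)
n = 50
x1 = rng.uniform(25, 50, n)      # latitude (degrees), assumed range
x2 = rng.uniform(0, 2000, n)     # elevation (meters), assumed range
# Assumed "twisted surface" truth, with an interaction term and noise.
y = 100 - 1.5 * x1 - 0.01 * x2 + 0.0002 * x1 * x2 + rng.normal(0, 2, n)

# Design matrix: intercept, x1, x2, and the interaction column x1*x2.
X = np.column_stack([np.ones(n), x1, x2, x1 * x2])

# Least-squares estimates b0, b1, b2, b3 of the model coefficients.
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", b)
```

The interaction column is just the elementwise product of the two predictor columns; adding it is what lets the effect of latitude change with elevation (the "twist").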
That is, the response (Y) has a normal distribution with mean mu_Y (which depends on the values of the independent variables, the x's) and a constant standard deviation sigma (which does not depend on the values of the X's).

We use data to find the Fitted Equation (Prediction Equation):

  y-hat = b0 + b1*x1 + b2*x2 + ... + bp*xp

ANOVA F-test: Overall test of the goodness of the model

  H0: beta1 = beta2 = beta3 = ... = betap = 0   (NOTHING GOOD in the model)
  Ha: at least one beta is not 0                (SOMETHING IS GOOD)

Test statistic:  F = MSReg / MSE

P-value from the tables of the F-distribution with

  df1 = p = degrees of freedom of MSReg
  df2 = n - p - 1 = degrees of freedom of MSE

ANOVA for the Multiple Regression Model

  Source              df           SS      MS                        F
  Regression (Model)  p            SSReg   MSReg = SSReg / p         F = MSReg / MSE
  Residual (Error)    n - p - 1    SSE     MSE = SSE / (n - p - 1)
  Total               n - 1        SST

Testing for Individual betas: Computer output from Minitab:

Regression Analysis: Y vs. X1, X2, ..., Xp

  Predictor   Coef   SE Coef   T           P
  Constant    b0     SE(b0)    b0/SE(b0)   ....
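The ANOVA table and the individual t-tests above can be computed directly from the residuals. The sketch below does this by hand with numpy and scipy on a small made-up dataset (the data and true coefficients are invented; the formulas are the ones in the table: MSReg = SSReg/p, MSE = SSE/(n - p - 1), F = MSReg/MSE, and t_j = b_j/SE(b_j)):

```python
import numpy as np
from scipy import stats

# Hypothetical data: n = 30 observations, p = 2 predictors.
rng = np.random.default_rng(1)
n, p = 30, 2
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 5 + 2 * x1 - 3 * x2 + rng.normal(0, 1, n)   # assumed truth + noise

X = np.column_stack([np.ones(n), x1, x2])        # design matrix
b = np.linalg.lstsq(X, y, rcond=None)[0]         # fitted b0, b1, b2
yhat = X @ b

# Sums of squares, matching the ANOVA table.
sse = np.sum((y - yhat) ** 2)          # Residual (Error), df = n - p - 1
sst = np.sum((y - y.mean()) ** 2)      # Total,            df = n - 1
ssreg = sst - sse                      # Regression,       df = p

msreg = ssreg / p
mse = sse / (n - p - 1)
F = msreg / mse
p_value = stats.f.sf(F, p, n - p - 1)  # overall F-test p-value

# Individual t-tests, as in the Minitab output: t = b_j / SE(b_j).
se = np.sqrt(mse * np.diag(np.linalg.inv(X.T @ X)))
t = b / se
t_pvals = 2 * stats.t.sf(np.abs(t), n - p - 1)

print("F =", F, " p-value =", p_value)
print(np.column_stack([b, se, t, t_pvals]))  # Coef, SE Coef, T, P
```

The final printout has the same four columns (Coef, SE Coef, T, P) as the Minitab table shown above, one row per predictor including the constant.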
This note was uploaded on 04/14/2011 for the course STA 3032 taught by Professor Kyung during the Spring '08 term at University of Florida.
