# STA3032 Chapter 8 MLR - Chapter 11 Multiple Regression Analysis

## Chapter 11 Multiple Regression Analysis

The basic ideas are the same as in Chapter 7:

- We have one response (dependent) variable, Y.
- The response (Y) is a quantitative variable.
- There are p (≥ 2) predictors (independent variables) in the model: x1, x2, …, xp. The predictors can be:
  - Quantitative (as before)
  - Categorical (new)
  - Interaction terms (products of predictors)
  - Powers of predictors (e.g., x4²)
- In this course we will concentrate on:
  - Reading computer output
  - Interpreting coefficients
  - Determining the order in which to interpret things

### Some Examples

**Example 1:** Suppose we want to predict temperature for different cities based on their latitude and elevation. In this case, the response and the predictors are

- Y = temperature
- x1 = latitude
- x2 = elevation

Possible models are

- With p = 2: y = β0 + β1x1 + β2x2 + ε (stiff surface)
- With p = 3: y = β0 + β1x1 + β2x2 + β3x1x2 + ε (twisted surface)

Chapter 8, Page 1 of 25
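The p = 2 model in Example 1 can be fit by ordinary least squares. A minimal sketch in plain Python, solving the normal equations (XᵀX)b = Xᵀy directly; the city data (latitudes, elevations, temperatures) and the helper names `solve`/`fit_mlr` are made up for illustration, not from the notes:

```python
def solve(A, v):
    """Solve the small square system A b = v by Gauss-Jordan elimination."""
    n = len(A)
    M = [row[:] + [v[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        # partial pivoting: pick the remaining row with the largest entry
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_mlr(x1, x2, y):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 via the normal equations."""
    X = [[1.0, a, b] for a, b in zip(x1, x2)]  # design matrix with intercept
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
    return solve(XtX, Xty)  # [b0, b1, b2]

# Toy cities: temperatures constructed to lie exactly on the plane
# y = 100 - 1.5*latitude - 0.003*elevation
lats = [30, 35, 40, 45, 25]
elevs = [100, 500, 200, 1500, 50]
temps = [54.7, 46.0, 39.4, 28.0, 62.35]
b0, b1, b2 = fit_mlr(lats, elevs, temps)
# b0 ≈ 100.0, b1 ≈ -1.5, b2 ≈ -0.003 for this exactly-planar toy data
```

Because the toy responses sit exactly on a plane, least squares recovers the generating coefficients; with real data the b's would only approximate the β's.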

**Example 2:** We want to predict patients' "well-being" from the dosage of medicine they take (in mg) using a quadratic model:

y = β0 + β1x + β2x² + ε

Here x = dosage of the active ingredient (in mg), and p = 2.

**Example 3:** Suppose we want to predict Y = the highway mileage of a car using x1 = its city mileage and x2 = its size (a categorical variable), where

x2 = 0 if the car is compact, 1 if the car is larger.

The model we may use is

y = β0 + β1x1 + β2x2 + β3(x1x2) + ε

Note that the last term, β3(x1x2), is an interaction term, which allows for NON-parallel lines.
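The non-parallel-lines point in Example 3 can be seen numerically. A small sketch, with hypothetical coefficient values chosen only to illustrate (they are not estimates from real data):

```python
# Hypothetical fitted coefficients for y = b0 + b1*x1 + b2*x2 + b3*(x1*x2)
b0, b1, b2, b3 = 5.0, 0.8, -2.0, 0.1

def predict_highway(city_mpg, larger):
    """Predicted highway mileage; larger=True means the dummy x2 = 1."""
    x2 = 1 if larger else 0
    return b0 + b1 * city_mpg + b2 * x2 + b3 * (city_mpg * x2)

# Slope of each group's line = change in prediction per 1 mpg of city mileage
slope_compact = predict_highway(21, False) - predict_highway(20, False)  # = b1
slope_larger = predict_highway(21, True) - predict_highway(20, True)     # = b1 + b3
```

Because b3 ≠ 0, the two groups get different slopes (0.8 vs. 0.9 here); with b3 = 0 the interaction vanishes and the two lines would be parallel, differing only by the intercept shift b2.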
## The Multiple Linear Regression Model

y = β0 + β1x1 + β2x2 + … + βpxp + ε

Assumptions:

1) ε ~ N(0, σ). [Error terms are iid normal with mean zero and constant standard deviation σ.]
2) As a result of this, Y ~ N(µY, σ) for every combination of x1, x2, …, xp. That is, the response (Y) has a normal distribution with mean µY (which depends on the values of the independent variables, the x's) and a constant standard deviation σ (which does not depend on the values of the x's).

We use data to find the fitted equation (prediction equation):

ŷ = b0 + b1x1 + b2x2 + … + bpxp

### ANOVA F-test: Overall test of the "goodness" of the model

- Ho: β1 = β2 = β3 = … = βp = 0 (NOTHING GOOD in the model)
- Ha: at least one of the β's ≠ 0 (SOMETHING IS GOOD)

Test statistic: F = MSReg / MSE

The p-value comes from the tables of the F distribution with

- df1 = p = degrees of freedom of MSReg
- df2 = n − p − 1 = degrees of freedom of MSE

### ANOVA Table for the Multiple Regression Model

| Source             | df        | SS    | MS                      | F             |
|--------------------|-----------|-------|-------------------------|---------------|
| Regression (Model) | p         | SSReg | MSReg = SSReg / p       | F = MSReg/MSE |
| Residual (Error)   | n − p − 1 | SSE   | MSE = SSE / (n − p − 1) |               |
| Total              | n − 1     | SST   |                         |               |
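The ANOVA quantities can be computed directly from the responses and the fitted values. A sketch with made-up data for n = 6 and p = 2 (the y and ŷ values below are purely illustrative; in practice ŷ comes from the fitted equation):

```python
# Made-up observed responses and fitted values for an n = 6, p = 2 model
y = [54.0, 47.0, 40.0, 29.0, 62.0, 50.0]
yhat = [53.5, 46.8, 40.9, 28.7, 61.6, 50.5]
n, p = len(y), 2

ybar = sum(y) / n
sst = sum((yi - ybar) ** 2 for yi in y)               # Total, df = n - 1
sse = sum((yi - fi) ** 2 for yi, fi in zip(y, yhat))  # Error, df = n - p - 1
# For a least-squares fit with an intercept, SST = SSReg + SSE exactly
ssreg = sst - sse                                     # Regression, df = p

ms_reg = ssreg / p         # df1 = p
mse = sse / (n - p - 1)    # df2 = n - p - 1
F = ms_reg / mse           # compare to F(df1, df2) to get the p-value
```

For these toy numbers SST = 656, SSE = 1.6, so F = 327.2 / (1.6/3) = 613.5, far out in the right tail of F(2, 3): we would reject Ho and conclude that something in the model is good.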

## Testing for Individual β's

Computer output from Minitab — Regression Analysis: Y versus x1, x2, …, xp:

| Predictor | Coef | SE Coef | T         | P |
|-----------|------|---------|-----------|---|
| Constant  | b0   | SE(b0)  | b0/SE(b0) | . |
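The T column above is just the coefficient divided by its standard error, compared to a t distribution with n − p − 1 degrees of freedom. A tiny sketch with hypothetical numbers (the estimate and standard error below are not from the notes):

```python
# Hypothetical Minitab-style output for one predictor: Coef and SE Coef
b1, se_b1 = 0.83, 0.21

# T = b_i / SE(b_i), with df = n - p - 1
t_stat = b1 / se_b1  # about 3.95: a large |T| is evidence that beta_1 != 0,
                     # given the other predictors already in the model
```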