Chapter 11: Regression and Correlation Methods
Stat 491: Biostatistics



Overall F-test for the regression (ANOVA table):

    Source        df    MS          F        Sig.
    Regression     5    13168.747   43.490   0.000
    Residual      45      302.802
    Total         50

Inference on the individual partial slopes:

    Variable      ˆβj       SE       t        p-value   VIF
    (Constant)    38.505    13.682    2.814   0.007
    Age           -0.257     0.204   -1.259   0.215     1.095
    Bed          -12.636     4.424   -2.856   0.006     1.513
    Bath          -0.545     6.243   -0.087   0.931     2.193
    Size          61.882     6.067   10.200   0.000     2.544
    Lot            0.001     0.000    1.426   0.161     1.139

Outline: Introduction; Least Square Estimates of the Parameters; Inference
about the Parameters; Prediction; Assessing Adequacy of Fit; Correlation;
Multiple Regression (Introduction; Inferences in Multiple Regression; Tests
for Subset of Regression Coefficients; Prediction (Forecasting); Dummy
Variables).

An Example: Determinants of House Price, Cont'd...

- How do you interpret the coefficient of Bedroom? Does this interpretation
  make sense?
- Age, number of bathrooms, and lot size do not separately have predictive
  value for the price of the house. Does this make sense?
- Is there any evidence that any of the predictors are involved in
  collinearity?
- Do you think age, number of bathrooms, and lot size collectively have any
  predictive value? Here, keep in mind that multicollinearity is not an
  issue.
- In general, suppose H0: β1 = β2 = ... = βk = 0 is rejected but none of the
  individual hypotheses H0: βj = 0 is rejected. What would this tell us?

Tests for Subset of Regression Coefficients

Suppose we are interested in testing the additional (unique) predictive
value of x_{g+1}, x_{g+2}, ..., x_k given x_1, x_2, ..., x_g. That is, we
are interested in testing

    H0: β_{g+1} = β_{g+2} = ... = β_k = 0
    vs.
    Ha: at least one of β_{g+1}, β_{g+2}, ..., β_k is not zero.

Recall that TSS = Σ(y − ȳ)² is a measure of the total variability in y.
With knowledge of the full set of variables x_1, x_2, ..., x_k, i.e., under
the full model, TSS = SSR_f + SSE_f. With knowledge of only x_1, x_2, ...,
x_g, i.e., under the reduced model, TSS = SSR_r + SSE_r.
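The partial F-test for a subset of coefficients, and the VIF diagnostic for collinearity, can both be sketched with plain NumPy. The data below are simulated for illustration (the seed, sample size, and helper names `sse` and `vif` are assumptions, not the house-price data from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, g = 60, 5, 2   # n observations; k predictors in full model; g in reduced

X = rng.normal(size=(n, k))
# True model: only the first two predictors carry signal here.
y = 3.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=1.0, size=n)

def sse(y, X):
    """Residual sum of squares from an OLS fit with an intercept."""
    Xd = np.column_stack([np.ones(len(y)), X])     # design matrix
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)  # least-squares estimates
    resid = y - Xd @ beta
    return float(resid @ resid)

sse_f = sse(y, X)          # SSE_f: full model, all k predictors
sse_r = sse(y, X[:, :g])   # SSE_r: reduced model, first g predictors only

# Partial F statistic: F = [(SSE_r - SSE_f)/(k - g)] / [SSE_f/(n - k - 1)],
# compared against the F distribution with (k - g, n - k - 1) df.
F = ((sse_r - sse_f) / (k - g)) / (sse_f / (n - k - 1))
print(f"SSE_r = {sse_r:.2f}  SSE_f = {sse_f:.2f}  partial F = {F:.3f}")

def vif(X, j):
    """Variance inflation factor VIF_j = 1 / (1 - R_j^2), where R_j^2 comes
    from regressing predictor j on the remaining predictors."""
    xj = X[:, j]
    others = np.delete(X, j, axis=1)
    Xd = np.column_stack([np.ones(len(xj)), others])
    beta, *_ = np.linalg.lstsq(Xd, xj, rcond=None)
    resid = xj - Xd @ beta
    r2 = 1.0 - (resid @ resid) / np.sum((xj - xj.mean()) ** 2)
    return 1.0 / (1.0 - r2)

print("VIFs:", [round(vif(X, j), 3) for j in range(k)])
```

A large F rejects H0: β_{g+1} = ... = β_k = 0, i.e., the last k − g predictors add predictive value beyond the first g. Since the reduced model is nested in the full model, SSE_f ≤ SSE_r always holds, so F is never negative.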

This note was uploaded on 02/03/2014 for the course STAT 491 taught by Professor Solomonharrar during the Fall '12 term at Montana.
