Price and Advertising affect pie sales at α = .05

From Excel output:

              Coefficients   Standard Error    t Stat    P-value
Price           -24.97509        10.83213     -2.30565   0.03979
Advertising      74.13096        25.96732      2.85478   0.01449

Decision: Reject H0 for each variable. With α/2 = .025 and 12 d.f., the critical values are ±2.1788; each |t Stat| falls in the rejection region (equivalently, each p-value is less than .05).
Conclusion: There is evidence that both Price and Advertising affect pie sales at α = .05.
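The t test on this slide can be reproduced in a few lines (a minimal sketch using scipy; the coefficient and standard error are taken from the Excel output above):

```python
from scipy.stats import t

n, k = 15, 2                    # sample size, number of independent variables
df = n - k - 1                  # 12 degrees of freedom

# Price coefficient and its standard error from the Excel output
b_price, se_price = -24.97509, 10.83213

t_stat = b_price / se_price             # t Stat, about -2.30565
p_value = 2 * t.sf(abs(t_stat), df)     # two-tailed p-value, about 0.03979
t_crit = t.ppf(1 - 0.05 / 2, df)        # critical value, about 2.1788

# Reject H0 because |t_stat| > t_crit, i.e. p_value < .05
print(round(t_stat, 5), round(p_value, 5), round(t_crit, 4))
```

The same calculation with Advertising's coefficient and standard error reproduces its t Stat of 2.85478 and p-value of 0.01449.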

Chap 15-31 Confidence Interval Estimate for the Slope

Confidence interval for the population slope β1 (the effect of changes in price on pie sales):

    b1 ± t(α/2) s(b1)        where t has (n – k – 1) d.f.

              Coefficients   Standard Error   …   Lower 95%    Upper 95%
Intercept      306.52619       114.25389      …    57.58835    555.46404
Price          -24.97509        10.83213      …   -48.57626     -1.37392
Advertising     74.13096        25.96732      …    17.55303    130.70888

Example: Weekly sales are estimated to be reduced by between 1.37 and 48.58 pies for each $1 increase in the selling price.
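The Lower 95% and Upper 95% columns can be reproduced directly from the interval formula (a sketch using scipy and the Price row of the Excel output):

```python
from scipy.stats import t

n, k = 15, 2
df = n - k - 1                       # 12 degrees of freedom
t_crit = t.ppf(0.975, df)            # about 2.1788 for a 95% interval

# Price coefficient and its standard error from the Excel output
b_price, se_price = -24.97509, 10.83213

lower = b_price - t_crit * se_price  # about -48.576
upper = b_price + t_crit * se_price  # about -1.374
print(round(lower, 5), round(upper, 5))
```

Since the interval is entirely below zero, we are 95% confident the price effect is negative, consistent with rejecting H0 in the t test.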

Chap 15-32 Standard Deviation of the Regression Model

The estimate of the standard deviation of the regression model is:

    s = sqrt( SSE / (n – k – 1) ) = sqrt(MSE)

Is this value large or small? It must be compared to the typical size of y to judge.

Chap 15-33 Standard Deviation of the Regression Model (continued)

Regression Statistics
Multiple R           0.72213
R Square             0.52148
Adjusted R Square    0.44172
Standard Error      47.46341
Observations        15

ANOVA
             df        SS          MS          F       Significance F
Regression    2   29460.027   14730.013    6.53861        0.01201
Residual     12   27033.306    2252.776
Total        14   56493.333

              Coefficients   Standard Error    t Stat    P-value   Lower 95%    Upper 95%
Intercept      306.52619       114.25389      2.68285    0.01993    57.58835    555.46404
Price          -24.97509        10.83213     -2.30565    0.03979   -48.57626     -1.37392
Advertising     74.13096        25.96732      2.85478    0.01449    17.55303    130.70888

The standard deviation of the regression model is 47.46

Chap 15-34 Standard Deviation of the Regression Model (continued)

The standard deviation of the regression model is 47.46. A rough prediction range for pie sales in a given week is ŷ ± 2(47.46), i.e. about ± 94.93 pies. Pie sales in the sample were in the 300 to 500 per week range, so this range is probably too large to be acceptable. The analyst may want to look for additional variables that can explain more of the variation in weekly sales.
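Both the standard deviation of the model and the rough ±2s prediction range follow directly from the ANOVA table (a sketch using the SSE and degrees of freedom from the Excel output above):

```python
import math

# From the ANOVA table: residual sum of squares, sample size, predictors
SSE, n, k = 27033.306, 15, 2

s = math.sqrt(SSE / (n - k - 1))   # sqrt(2252.776), about 47.46
rough_range = 2 * s                # rough prediction half-width, about 94.93
print(round(s, 5), round(rough_range, 2))
```

Note that sqrt(MSE) = sqrt(2252.776) gives the same value, since MSE = SSE / (n – k – 1).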

Chap 15-35 Table 12.4 (p. 482) Data Structure for Multiple Regression with Two Input Variables

Chap 15-36 Box on Page 482 A Multiple Regression Model

Chap 15-37 Table 12.5 (p. 483) The Data of x1= Weight in Pounds, x2= Age, and y= Blood Pressure of 13 Males

Chap 15-38 Table 12.6 (p. 484) Regression Analysis of the Data in Table 12.5: Selected MINITAB Output

Chap 15-39 Table 12.7 (p. 487) A Regression Analysis of the Data in Example 2 Using SAS.

Chap 15-40 Multicollinearity

Multicollinearity: high correlation exists between two independent variables. This means the two variables contribute redundant information to the multiple regression model.

Chap 15-41 Multicollinearity (continued)

Including two highly correlated independent variables can adversely affect the regression results:
- No new information is provided
- Can lead to unstable coefficients (large standard errors and low t-values)
- Coefficient signs may not match prior expectations

Chap 15-42 Some Indications of Severe Multicollinearity
- Incorrect signs on the coefficients
- Large change in the value of a previous coefficient when a new variable is added to the model
- A previously significant variable becomes insignificant when a new independent variable is added
- The estimate of the standard deviation of the model increases when a variable is added to the model
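The coefficient instability described above is easy to demonstrate by simulation (a sketch on hypothetical generated data; coef_se is an illustrative helper, not a library function):

```python
import numpy as np

def coef_se(X, y):
    """OLS coefficients and their standard errors (X includes an
    intercept column)."""
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ y
    resid = y - X @ b
    s2 = resid @ resid / (len(y) - X.shape[1])   # MSE
    return b, np.sqrt(s2 * np.diag(XtX_inv))

rng = np.random.default_rng(1)
n = 40
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # nearly a copy of x1
y = 3 * x1 + rng.normal(size=n)

X_one = np.column_stack([np.ones(n), x1])        # x1 alone
X_two = np.column_stack([np.ones(n), x1, x2])    # add the collinear x2
_, se_one = coef_se(X_one, y)
_, se_two = coef_se(X_two, y)
print(se_one[1], se_two[1])   # the SE of x1's coefficient balloons
```

Adding the redundant x2 supplies no new information, yet the standard error on x1's coefficient inflates dramatically, which is exactly the "large standard error and low t-value" symptom listed above.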

Chap 15-43 Detect Collinearity (Variance Inflationary Factor)

VIFj is used to measure collinearity:

    VIFj = 1 / (1 – Rj²)

where Rj² is the coefficient of determination from regressing xj on all the other independent variables. A common rule of thumb: if VIFj ≥ 5, xj is highly correlated with the other explanatory variables.
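The VIF can be computed by running the auxiliary regression of each xj on the remaining predictors (a minimal numpy sketch on hypothetical simulated data; vif is an illustrative helper, not a library function):

```python
import numpy as np

def vif(X, j):
    """VIF for column j of design matrix X (no intercept column):
    regress x_j on the remaining predictors plus an intercept,
    then VIF_j = 1 / (1 - R_j^2)."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(y)), others])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return 1.0 / (1.0 - r2)

# Hypothetical data: x2 is almost a linear function of x1
rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
x2 = 2 * x1 + rng.normal(scale=0.1, size=50)   # highly collinear with x1
x3 = rng.normal(size=50)                        # independent of both
X = np.column_stack([x1, x2, x3])
print(vif(X, 0))   # very large: x1 is well explained by x2
print(vif(X, 2))   # near 1: x3 carries its own information
```

By the rule of thumb above, x1 and x2 would be flagged (VIF well above 5) while x3 would not.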
