multiple regression noteshells 2

Multiple Regression Part 1: Introduction (Sections 14.1-14.3)

Multiple Regression
- The data set consists of n observations (often called cases).
- Each case has a response Y and two or more predictors: X1, X2, ..., Xk.
- Some Xs may be functions of other Xs, e.g. X3 = X1/X2 or X3 = X1*X1.
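The "Xs as functions of other Xs" idea amounts to adding derived columns to the design matrix before fitting. A minimal sketch with made-up numbers (none of these values come from the notes):

```python
import numpy as np

# Toy data: n = 5 cases with two measured predictors (hypothetical values).
X1 = np.array([10.0, 12.0, 15.0, 9.0, 11.0])
X2 = np.array([2.0, 3.0, 5.0, 4.0, 2.5])

# Derived predictors, as on the slide: a ratio and a square.
X3 = X1 / X2        # X3 = X1 / X2
X4 = X1 * X1        # X4 = X1 * X1 (i.e., X1 squared)

# Design matrix with an intercept column, one row per case.
X = np.column_stack([np.ones_like(X1), X1, X2, X3, X4])
print(X.shape)  # one row per case, one column per coefficient b0..b4
```

Once the derived columns are in the matrix, the fitting step treats them exactly like any other predictor.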
Book's Example
- n = 34 stores in a chain
- Y = monthly sales of the OmniPower bar
- X1 = price of the bar, in cents
- X2 = in-store promotion expenditures (signs, displays, coupons, etc.)
- Three prices and three promotion levels were used.

Exploratory Tools
- Multiple scatter plots - can use PHStat or Excel directly
- Correlations - use the CORREL function several times
- Correlation matrix (requires the Data Analysis ToolPak)
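The same exploratory step (repeated CORREL calls, or the ToolPak's correlation matrix) takes a few lines outside Excel. A sketch with invented sales, price, and promotion figures, not the book's 34-store data:

```python
import numpy as np

# Hypothetical sales, price (cents), and promotion ($) figures -
# illustrative only, not the OmniPower data set.
sales = np.array([4100, 3800, 3300, 4700, 2900, 3500])
price = np.array([59, 69, 79, 59, 89, 79])
promo = np.array([400, 300, 200, 500, 200, 300])

# np.corrcoef returns the full correlation matrix,
# analogous to the ToolPak's correlation-matrix output.
R = np.corrcoef([sales, price, promo])
print(np.round(R, 4))
```

Each off-diagonal entry is what a single CORREL call would return for that pair of columns.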
OmniPower Sales - Correlation Matrix

            Sales     Price     Promotion
Sales       1
Price      -0.7351    1
Promotion   0.5351   -0.0968    1

[Scatter plots: Sales versus Promotion Budget; Sales versus Price]

Quick Impressions
- The relationships appear sensible: sales increase as price drops or promotion rises.
- Both variables look like useful predictors, with price (r = -.735) better than promotion (r = .535).
- There is very little correlation between price and promotion, which is good.
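The "very little correlation between predictors" point can be quantified with a variance inflation factor, which for two predictors is VIF = 1/(1 - r^2). VIF is not mentioned in these notes; this is just one common way to make the observation concrete, using the r = -0.0968 from the matrix above:

```python
# Correlation between price and promotion (from the correlation matrix).
r = -0.0968

# With exactly two predictors, each one's VIF is 1 / (1 - r^2).
vif = 1.0 / (1.0 - r**2)
print(round(vif, 4))  # ~1.0095: essentially no variance inflation
```

A VIF near 1 confirms the slide's conclusion: the two predictors carry nearly independent information, so their coefficient estimates will not be destabilized by multicollinearity.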
Estimation of Parameters
- Use the least squares approach again:
    Y-hat = b0 + b1*X1 + b2*X2 + ... + bk*Xk
- Find (b0, b1, b2, ..., bk) to minimize the sum of squared residuals.

PhStat Output - Two-variable regression

Regression Statistics
  Multiple R           0.8705
  R Square             0.7577
  Adjusted R Square    0.7421
  Standard Error       638.0653
  Observations         34

ANOVA
              df    SS             MS             F         Significance F
  Regression   2    39472730.77    19736365.39    48.4771   0.0000
  Residual    31    12620946.67    407127.31
  Total       33    52093677.44

              Coefficients   Standard Error   t Stat   P-value   Lower 95%   Upper 95%
  Intercept   5837.5208      628.1502         9.2932   0.0000    4556.3999   7118.6416
  (remaining coefficient rows not shown in the preview)
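The least-squares fit and the summary quantities PhStat reports (R Square, Adjusted R Square, the ANOVA F) can be sketched directly. The data below are simulated, since the 34-store OmniPower data set is not reproduced in these notes; only the formulas mirror the output above:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 34, 2  # 34 cases, 2 predictors, matching the PhStat output's shape

# Simulated stand-ins for price and promotion (hypothetical values).
price = rng.uniform(59, 99, n)
promo = rng.uniform(200, 600, n)
# An assumed "true" relationship plus noise, just to have something to fit.
sales = 5800 - 50 * price + 3.6 * promo + rng.normal(0, 600, n)

# Least squares: minimize the sum of squared residuals.
X = np.column_stack([np.ones(n), price, promo])
b, _, _, _ = np.linalg.lstsq(X, sales, rcond=None)

# Sums of squares, as in the ANOVA table: SST = SSR + SSE.
resid = sales - X @ b
sse = resid @ resid                       # Residual SS
sst = ((sales - sales.mean())**2).sum()   # Total SS
ssr = sst - sse                           # Regression SS

r2 = ssr / sst                                      # R Square
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)       # Adjusted R Square
f_stat = (ssr / k) / (sse / (n - k - 1))            # ANOVA F
print(round(r2, 4), round(adj_r2, 4), round(f_stat, 4))
```

Note how each line maps onto a cell of the PhStat output: the df values are k = 2 and n - k - 1 = 31, and F is the regression mean square over the residual mean square.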