# MultipleRegression 12 - Simple Regression (one predictor)

## Simple Regression (one predictor): Review

## Facebook example (z-score prediction)

Predicted z_y = (r_xy)(z_x) = (.66)(z_x)

Given z_x = -.45: predicted z_y = .66(-.45) = -.297

With M_x = 9.25, s_x = 6.833, M_y = 3.25, s_y = 1.479:

- Convert back to raw Y: z_y(s_y) + M_y = -.297(1.479) + 3.25 = -.4393 + 3.25 = 2.8107, round to 3
- Raw X corresponding to z_x = -.45: z_x(s_x) + M_x = -.45(6.833) + 9.25 = -3.075 + 9.25 = 6.175, approx. 6 hrs
## Scouts example (z-score prediction)

Predicted z_y = (r_xy)(z_x) = (.203)(z_x)

Given z_x = +1.02: predicted z_y = .203(1.02) = .207

With M_x = 15.818, s_x = 11.629, M_y = 34.091, s_y = 15.911:

- Convert back to raw Y: z_y(s_y) + M_y = .207(15.911) + 34.091 = 3.294 + 34.091 = 37.385, round to 37 boxes of cookies
- Raw X corresponding to z_x = +1.02: z_x(s_x) + M_x = 1.02(11.629) + 15.818 = 11.862 + 15.818 = 27.680, round to 28 boxes of popcorn
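The z-score route above can be sketched in a few lines of Python. The function name `predict_from_z` is mine, not from the slides; the numbers are the Facebook and Scouts values given above.

```python
# z-score prediction from the slides:
#   predicted z_y = r_xy * z_x,  then  Y = z_y * s_y + M_y
def predict_from_z(r_xy, z_x, mean_y, s_y):
    """Predict a raw Y score from a standardized X score."""
    z_y = r_xy * z_x           # predicted z-score on Y
    return z_y * s_y + mean_y  # convert back to raw Y units

# Facebook example: r = .66, z_x = -.45, M_y = 3.25, s_y = 1.479
print(round(predict_from_z(0.66, -0.45, 3.25, 1.479), 4))   # 2.8107

# Scouts example: r = .203, z_x = +1.02, M_y = 34.091, s_y = 15.911
print(round(predict_from_z(0.203, 1.02, 34.091, 15.911), 3))  # 37.386
```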

## Facebook example (raw-score prediction)

Predicted Y = bX + a, where b = (r_xy)(s_y)/s_x and a = M_y - (b)(M_x)

With M_x = 9.25, s_x = 6.833, M_y = 3.25, s_y = 1.479:

- b = [(.66)(1.479)]/6.833 = .97614/6.833 = .143
- a = 3.25 - (.143)(9.25) = 3.25 - 1.32275 = 1.927
- Predicted Y = .143(X) + 1.927

For X = 6 hours: predicted Y = .143(6) + 1.927 = .858 + 1.927 = 2.785, round to 3
## Scouts example (raw-score prediction)

Predicted Y = bX + a, where b = (r_xy)(s_y)/s_x and a = M_y - (b)(M_x)

With M_x = 15.818, s_x = 11.629, M_y = 34.091, s_y = 15.911:

- b = [(.203)(15.911)]/11.629 = 3.230/11.629 = .278
- a = 34.091 - (.278)(15.818) = 34.091 - 4.397 = 29.694
- Predicted Y = .278(X) + 29.694

For X = 10 boxes of popcorn: predicted Y = .278(10) + 29.694 = 2.78 + 29.694 = 32.474, approx. 32 boxes of cookies
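The raw-score formulas can be computed the same way. The helper name `regression_line` is mine; the inputs are the Scouts numbers from the slides. Note the slides round b to .278 before computing a, so their intercept (29.694) differs slightly from the unrounded one here.

```python
# Raw-score regression line: Y = bX + a,
# with b = r * s_y / s_x and a = M_y - b * M_x.
def regression_line(r, mean_x, s_x, mean_y, s_y):
    b = r * s_y / s_x
    a = mean_y - b * mean_x
    return b, a

# Scouts example
b, a = regression_line(0.203, 15.818, 11.629, 34.091, 15.911)
print(round(b, 3), round(a, 3))   # 0.278 29.698 (slides get 29.694 using the rounded b)
print(round(b * 10 + a, 3))       # 32.475 boxes of cookies for X = 10 (slides: 32.474)
```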

## Prediction question

How much will the LOST audience enjoy THE EVENT?
## From prediction to prediction error

| X | Actual Y | Predicted Y | Error = Actual Y - Predicted Y |
|---|----------|-------------|--------------------------------|
| 1 | 2 | 2.378 | -0.378 |
| 1 | 3 | 2.378 | 0.622  |
| 2 | 1 | 2.674 | -1.674 |
| 3 | 4 | 2.97  | 1.03   |
| 3 | 5 | 2.97  | 2.03   |
| 4 | 5 | 3.266 | 1.734  |
| 4 | 3 | 3.266 | -0.266 |
| 4 | 2 | 3.266 | -1.266 |
| 4 | 1 | 3.266 | -2.266 |
| 5 | 4 | 3.562 | 0.438  |
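The error column can be recomputed directly from the actual and predicted Y values in the table (variable names are mine):

```python
# Error = actual Y - predicted Y, for each of the ten viewers.
actual    = [2, 3, 1, 4, 5, 5, 3, 2, 1, 4]
predicted = [2.378, 2.378, 2.674, 2.97, 2.97,
             3.266, 3.266, 3.266, 3.266, 3.562]
errors = [round(a - p, 3) for a, p in zip(actual, predicted)]
print(errors)
# [-0.378, 0.622, -1.674, 1.03, 2.03, 1.734, -0.266, -1.266, -2.266, 0.438]
```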

## Calculating error

Error = Actual Y - Predicted Y (the ten errors: -0.378, 0.622, -1.674, 1.03, 2.03, 1.734, -0.266, -1.266, -2.266, 0.438)

Calculate the average error = Σ(Y - Predicted Y) / N
## Problem

Σ(Actual Y - Predicted Y) = 0

Why? Because the regression line is the "balancing point" among all the (X, Y) pairs.
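This cancellation is easy to verify with the errors from the table (the sum here comes out to .004 rather than exactly 0 only because the predicted-Y values were rounded to three decimals):

```python
# The residuals sum to (essentially) zero because the least-squares
# line is the balancing point of the data.
errors = [-0.378, 0.622, -1.674, 1.03, 2.03,
          1.734, -0.266, -1.266, -2.266, 0.438]
print(round(sum(errors), 3))  # 0.004 -- nonzero only due to rounding in the table
```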

## Residuals

[Figure: scatter plot of X (LOST enjoyment, 0-6) against Y (THE EVENT enjoyment, 0-6), showing each actual Y alongside its predicted Y on the regression line; the vertical gaps between them are the residuals.]
## Balancing point

Sound familiar? The mean is the balancing point for a set of scores, so Σ(X - M) = 0.

To get around this problem, square each deviation before summing: Σ(X - M)(X - M) = Σ(X - M)².
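A quick check with the X column from the LOST example shows both facts: raw deviations cancel, squared deviations do not (the same trick variance uses):

```python
# Deviations from the mean always sum to zero; squaring first
# prevents the positive and negative deviations from cancelling.
xs = [1, 1, 2, 3, 3, 4, 4, 4, 4, 5]   # X column from the LOST example
m = sum(xs) / len(xs)                  # mean = 3.1
deviations = [x - m for x in xs]
ss = sum(d * d for d in deviations)    # sum of squared deviations
print(round(ss, 3))                    # 16.9
# sum(deviations) is 0 up to floating-point noise
```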

## Method of Least Squares

1. Draw a line.
2. Calculate predicted Y for each person.
3. Compare predicted Y to actual Y.
4. Square and sum the differences between predicted Y and actual Y.
5. Divide by N.
6. The result is called the mean square error (MSE); we want it to be LOW.
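Steps 2 through 5 can be scored for any candidate line. The slope .296 and intercept 2.082 used below are not stated on the slides; I inferred them from the predicted-Y column of the earlier table (each unit of X adds .296, and 2.378 - .296 = 2.082):

```python
# Steps 2-5 of the least-squares method: predict, compare,
# square-and-sum, divide by N.
def mean_square_error(xs, ys, b, a):
    errors_sq = [(y - (b * x + a)) ** 2 for x, y in zip(xs, ys)]
    return sum(errors_sq) / len(xs)

xs = [1, 1, 2, 3, 3, 4, 4, 4, 4, 5]   # LOST ratings
ys = [2, 3, 1, 4, 5, 5, 3, 2, 1, 4]   # THE EVENT ratings
print(round(mean_square_error(xs, ys, 0.296, 2.082), 3))  # 1.852
```

Trying other (b, a) pairs in the call gives a larger MSE, which is what "least squares" means: this line minimizes it.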
## Eyeballing a regression line

http://www.ruf.rice.edu/~lane/stat_sim/reg_by_eye/index.html

## Prediction using raw scores

Advantages:
- Do not need to convert scores to z-scores
- Easy to apply the formula to make predictions

Disadvantages:
- Cannot compare across variables
- More difficult to calculate

Formulas: Predicted Y = bX + a, where b = (r · s_y)/s_x and a = M_y - (b)(M_x)
## Prediction using z-scores

Advantages:
- Easy to calculate
- Can compare across variables

Disadvantages:
- Need to convert scores to z-scores
- More difficult to apply

Formula: Predicted Z_Y = r · Z_X
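Whichever route you take, the prediction is the same; the two formulas are algebraically equivalent. A sketch using the Facebook numbers (variable names are mine):

```python
# Both prediction routes, applied to the same raw X.
r, m_x, s_x, m_y, s_y = 0.66, 9.25, 6.833, 3.25, 1.479
x = 6.175  # raw score corresponding to z_x = -.45 (slides round to 6 hours)

# z-score route: standardize X, predict z_y, convert back
z_x = (x - m_x) / s_x
y_from_z = r * z_x * s_y + m_y

# raw-score route: compute b and a, then Y = bX + a
b = r * s_y / s_x
a = m_y - b * m_x
y_from_raw = b * x + a

print(round(y_from_z, 4), round(y_from_raw, 4))  # 2.8107 2.8107
```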

## Simple Regression (one predictor): Regression Coefficients and Coefficient of Determination
## Example: predicting GPA from SAT

One predictor (one "X"): SAT.

## Simple Linear Regression: SAT and GPA

Coefficients (a. dependent variable: GPA):

| Model 1    | Unstandardized B | Std. Error | Standardized Beta | t     | Sig. |
|------------|------------------|------------|-------------------|-------|------|
| (Constant) | 1.876            | .346       |                   | 5.422 | .002 |
| SAT        | .097             | .034       | .757              | 2.834 | .030 |
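Reading the equation off the table: the unstandardized B column gives predicted GPA = .097(SAT) + 1.876. A B of .097 per point suggests SAT is in some rescaled unit here (the scaling isn't shown in this preview), so the example input below is purely illustrative:

```python
# Prediction from the SPSS coefficients: intercept (Constant) = 1.876,
# slope for SAT = .097 (SAT units as used in class, not the 200-800 scale).
def predict_gpa(sat):
    return 0.097 * sat + 1.876

print(round(predict_gpa(10), 3))  # 2.846 for a hypothetical SAT value of 10 units
```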

## This note was uploaded on 05/26/2011 for the course PSCH 343 taught by Professor Victoria Harmon during the Spring '11 term at Ill. Chicago.

