# 17 - r2 and Intro to Multiple Regression - webct

Psyc110, March 27, 2008

## Outline

- r-squared interpretations
  - Proportion of variability
  - Venn diagrams for showing variances graphically
- Sums of squares: breaking them down
- Example calculation
- Multiple regression introduction

## r²: Overview

- r provides a measure of strength of (linear) association
- But it is somewhat limited, because it is not directly interpretable
- All we need to do to make it more interpretable is square it!
- r² = "proportion of variance accounted for"
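As a concrete sketch of "square r to get a proportion of variance," here is a minimal Python example. The `pearson_r` helper and the data values are my own illustration, not from the lecture:

```python
# Minimal sketch: compute r, then square it to get r^2.
# The data values below are made up for illustration.

def pearson_r(x, y):
    """Pearson correlation: sum of cross-products scaled by sqrt(SS_X * SS_Y)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sp = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))  # sum of cross-products
    ssx = sum((xi - mx) ** 2 for xi in x)                    # SS_X
    ssy = sum((yi - my) ** 2 for yi in y)                    # SS_Y
    return sp / (ssx * ssy) ** 0.5

x = [1, 2, 3, 4, 5]
y = [2, 1, 4, 3, 5]
r = pearson_r(x, y)
print(r, r ** 2)  # r = 0.8, so r^2 = 0.64: 64% of the variance "accounted for"
```

Note that r² discards the sign of r, which is why it measures only the *strength* of the association, not its direction.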

## Characteristics of r²

- Will always range from 0 to 1 (positive)
- Will always be a "smaller" number than r (in absolute value, since |r| ≤ 1)
- Has specific relations with different "sum of squares" calculations in regression
- Provides another way to interpret the correlation

## Interpreting r²

Example: calculations from the MSRA dataset (last class)

- r = .37, therefore r² = (.37)² = .14
- "14% of the variance in adolescent behavior problem scores is accounted for by infant attachment disorganization"
- (Does NOT mean 14% of individual scores are predicted)
- (Does NOT mean 14% error)

Venn diagram: each circle's area represents the variance of a variable; 14% of the area of Y overlaps with X.

## Sums of Squares in Regression

- For interpretive purposes, it can be useful to separate out variability into different components
- Review: SS_X = ∑(X − Xbar)², SS_Y = ∑(Y − Ybar)²
- Here we are most interested in breaking down SS_Y, because Y is what we are trying to predict
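The SS review formulas above can be sketched directly; `sum_of_squares` and the data values are my own illustration:

```python
# Sketch of the review formulas: SS = sum of squared deviations from the mean.
def sum_of_squares(vals):
    m = sum(vals) / len(vals)          # Xbar (or Ybar)
    return sum((v - m) ** 2 for v in vals)

x = [1, 2, 3, 4, 5]                     # made-up data; mean = 3
print(sum_of_squares(x))                # 4 + 1 + 0 + 1 + 4 = 10.0
```

The same function computes SS_X or SS_Y; only the variable you feed it changes.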
## Breaking Down SS_Y

In regression, SS_Y can be separated into two additive components:

- Variability due to variability in the predictor (SS_Yhat)
- Variability NOT due to variability in the predictor (SS_error)

We have already seen SS_error when calculating the standard error of the estimate:

s_est = √[ ∑(Y − Yhat)² / (n − 2) ]

SS_error is the numerator of the fraction under the square root:

SS_error = ∑(Y − Yhat)²

## What About SS_Yhat?

SS_Yhat represents the variability in Y directly attributable to variability in the predictor:

SS_Yhat = ∑(Yhat − Ybar)²

where Yhat is the predicted Y (based on X) and Ybar is the mean of Y (the best guess when X is not known).
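The additive decomposition above can be checked numerically: fit a least-squares line, compute all three sums of squares, and confirm that SS_Yhat + SS_error recovers SS_Y and that SS_Yhat / SS_Y equals r². This is a sketch with made-up data, not the lecture's example:

```python
# Sketch: verify SS_Y = SS_Yhat + SS_error, and r^2 = SS_Yhat / SS_Y.
# Data values are made up for illustration.

x = [1, 2, 3, 4, 5]
y = [2, 1, 4, 3, 5]
n = len(x)
mx, my = sum(x) / n, sum(y) / n

# Least-squares slope and intercept for Yhat = a + b*X
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
a = my - b * mx
yhat = [a + b * xi for xi in x]

ss_y = sum((yi - my) ** 2 for yi in y)                       # total variability in Y
ss_yhat = sum((yh - my) ** 2 for yh in yhat)                 # variability explained by X
ss_error = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))    # leftover variability

print(ss_y, ss_yhat + ss_error)   # the two components add back up to SS_Y
print(ss_yhat / ss_y)             # proportion of variance accounted for = r^2
```

Dividing both sides of SS_Y = SS_Yhat + SS_error by SS_Y shows why r² is read as "proportion of variance accounted for": SS_Yhat / SS_Y is exactly the explained share.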
