17 - r2 and Intro to Multiple Regression - webct

r² and Intro to Multiple Regression
Psyc110, March 27, 2008

Outline

- r-squared interpretations
  - Proportion of variability
  - Venn diagrams for showing variances graphically
- Sums of squares: breaking them down
- Example calculation
- Multiple regression introduction

r²: Overview

- r provides a measure of the strength of (linear) association
- But r is somewhat limited, because its value is not directly interpretable
- All we need to do to make it more interpretable is square it!
- r² = "proportion of variance accounted for"
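The r-to-r² step can be sketched from scratch. A minimal sketch on hypothetical data (these scores are made up for illustration, not from any dataset in the lecture):

```python
# Sketch: computing Pearson r and r^2 by hand (hypothetical data).
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0]   # predictor scores
y = [2.1, 2.9, 3.6, 4.8, 5.1]   # outcome scores

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# Pearson r: sum of cross-products over the root of the product of SS_X and SS_Y
sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
ss_x = sum((xi - mean_x) ** 2 for xi in x)
ss_y = sum((yi - mean_y) ** 2 for yi in y)
r = sxy / math.sqrt(ss_x * ss_y)

# Squaring r gives the proportion of variance in Y accounted for by X
r_squared = r ** 2
print(round(r, 3), round(r_squared, 3))
```

Note that r_squared drops the sign of r: it tells you the strength of the shared variability, not the direction of the relationship.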
3/27/2008 2 Characteristics of r 2 Will always range from 0 to 1 (positive) Will always be a “smaller” number than r Has specific relations with different “sum of squares” calculations in regression Provides another way to interpret Interpreting r 2 Example: calculations from MSRA dataset (last class) r = .37, therefore r 2 = (.37 2 ) = .14 14% of the variance in adolescent behavior problem scores is accounted for by infant attachment disorganization” (Does NOT mean 14% of individual scores are predicted) (Does NOT mean 14% error) Y Here, each circle’s area represents the variance of a variable( Venn diagram ) X 14% of the area of Y overlaps with X Sums of Squares in Regression For interpretive purposes, it can be useful to separate out variability into different components Review: SS X = ∑(X – Xbar) 2 SS Y = ∑(Y – Ybar) 2 Here, we are most interested in breaking down SS Y because it is what we are trying to predict
Breaking Down SS_Y

- In regression, SS_Y can be separated into two additive components:
  - Variability due to variability in the predictor (SS_Yhat)
  - Variability NOT due to variability in the predictor (SS_error)
- We have already seen SS_error, when calculating the standard error of the estimate:
  s_est = sqrt[ ∑(Y – Yhat)² / (n – 2) ]
- SS_error is the numerator of the inner fraction, or: SS_error = ∑(Y – Yhat)²

What About SS_Yhat?

- SS_Yhat represents variability in Y directly attributable to variability in the predictor
- SS_Yhat = ∑(Yhat – Ybar)²
  - Yhat: predicted Y (based on X)
  - Ybar: mean of Y (the best guess when X is unknown)
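The additive decomposition above can be verified numerically. A minimal sketch, assuming simple least-squares regression of Y on X and hypothetical data:

```python
# Sketch: verifying SS_Y = SS_Yhat + SS_error, and that SS_Yhat / SS_Y = r^2.
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0]   # hypothetical predictor
y = [2.0, 4.0, 5.0, 4.0, 6.0]   # hypothetical outcome

n = len(x)
mx, my = sum(x) / n, sum(y) / n

# Least-squares slope and intercept, then predicted scores Yhat
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
a = my - b * mx
yhat = [a + b * xi for xi in x]

ss_y     = sum((yi - my) ** 2 for yi in y)                  # total variability
ss_yhat  = sum((yh - my) ** 2 for yh in yhat)               # due to the predictor
ss_error = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))   # not due to the predictor

# The two components add up to the total
assert math.isclose(ss_y, ss_yhat + ss_error)

# r^2 is the proportion of SS_Y attributable to the predictor
r_squared = ss_yhat / ss_y
print(round(r_squared, 3))   # prints 0.727

# Standard error of the estimate, built from SS_error as on the slide
s_est = math.sqrt(ss_error / (n - 2))

# Equivalently, SS_Yhat = b^2 * SS_X, since Yhat - Ybar = b * (X - Xbar)
assert math.isclose(ss_yhat, b ** 2 * sum((xi - mx) ** 2 for xi in x))
```

This also previews why r² "has specific relations with the sum of squares": it is literally SS_Yhat divided by SS_Y.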