# lecture_2_3 - Looking at data: relationships. Least-squares regression


Looking at data: relationships. Least-squares regression. IPS chapter 2.3. © 2006 W. H. Freeman and Company

Objectives (IPS chapter 2.3): Least-squares regression

- The regression line
- Making predictions: interpolation
- Coefficient of determination, r²
Correlation tells us about the strength (scatter) and direction of the linear relationship between two quantitative variables. In addition, we would like a numerical description of how the two variables vary together: for instance, is one variable increasing faster than the other? We would also like to make predictions based on that description. But which line best describes our data?

The regression line

The least-squares regression line is the unique line such that the sum of the squared vertical (y) distances between the data points and the line is the smallest possible. The distances are squared so that all of them are positive values and can be properly added.
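A minimal numerical sketch of this definition, using NumPy and a small made-up data set (both are assumptions, not part of the lecture): the least-squares line's sum of squared vertical distances is no larger than that of any other line we try.

```python
import numpy as np

# Hypothetical sample data (x, y pairs).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

def ssr(a, b):
    """Sum of squared vertical distances from the points to the line y = a + b*x."""
    return np.sum((y - (a + b * x)) ** 2)

# Least-squares fit (degree-1 polynomial): returns slope b, then intercept a.
b_ls, a_ls = np.polyfit(x, y, 1)

# Any perturbed line has a larger (or equal) sum of squared residuals.
assert ssr(a_ls, b_ls) <= ssr(a_ls + 0.5, b_ls)
assert ssr(a_ls, b_ls) <= ssr(a_ls, b_ls + 0.2)
```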
Properties

The least-squares regression line can be shown to have the equation

ŷ = ȳ + r (s_y / s_x)(x − x̄), or equivalently ŷ = a + bx,

where ŷ is the predicted y value ("y hat"), b is the slope, and a is the y-intercept. The intercept a is in units of y; the slope b is in units of y per unit of x.
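A short sketch (hypothetical data; NumPy assumed) checking that the slope and intercept computed from the correlation, the standard deviations, and the means match a direct least-squares fit:

```python
import numpy as np

# Hypothetical sample data.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

r = np.corrcoef(x, y)[0, 1]               # correlation
s_x, s_y = x.std(ddof=1), y.std(ddof=1)   # sample standard deviations

b = r * s_y / s_x                         # slope, in units of y per unit of x
a = y.mean() - b * x.mean()               # intercept, in units of y

# Same line as a direct least-squares fit (polyfit returns slope, intercept).
b_fit, a_fit = np.polyfit(x, y, 1)
assert np.allclose([b, a], [b_fit, a_fit])
```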

First we calculate the slope of the line, b. From statistics we already know that b = r (s_y / s_x), where r is the correlation and s_y, s_x are the standard deviations of y and x.
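Using that slope, a quick prediction sketch (same hypothetical setup as above): once b and a are known, interpolation means predicting ŷ at an x value inside the observed range of the data.

```python
import numpy as np

# Hypothetical sample data.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

r = np.corrcoef(x, y)[0, 1]               # correlation
b = r * y.std(ddof=1) / x.std(ddof=1)     # slope b = r * s_y / s_x
a = y.mean() - b * x.mean()               # intercept: the line passes through (x-bar, y-bar)

x_new = 2.5                               # inside the observed x range: interpolation
y_hat = a + b * x_new                     # predicted y value at x_new
```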