
Regression line: A straight line that describes how a response variable y changes as an explanatory variable x changes. You can use a regression line to predict the value of y for any value of x by substituting that x into the equation of the line.

Least-squares regression line: The line that makes the sum of the squares of the vertical distances of the data points from the line as small as possible.

Equation of the least-squares regression line: We have data on an explanatory variable x and a response variable y for n individuals. From the data, calculate the means x-bar and y-bar, the standard deviations sx and sy of the two variables, and their correlation r. The least-squares regression line is the line ŷ = a + bx with slope b = r*(sy/sx) and intercept a = y-bar - b*x-bar.

residual/error = observed response - predicted response

The fitted straight-line model is ŷ = a + bx. The sign of the slope b (positive or negative) is the same as the sign of r, the correlation coefficient. "Causation" cannot be established from only one observational study. Since the y-intercept is a = y-bar - b*x-bar, if the slope b equals zero, then the best predictor for y is y-bar, the mean of the y's. Recall: ŷ = a + bx ...
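A minimal sketch in Python (not part of the original notes) of how the summary statistics x-bar, y-bar, sx, sy, and r determine the least-squares slope and intercept, and how residuals are computed; the data values here are made up purely for illustration.

    import statistics

    # Hypothetical example data (not from the course notes), for illustration only.
    x = [1.0, 2.0, 3.0, 4.0, 5.0]
    y = [2.1, 3.9, 6.2, 8.1, 9.8]

    n = len(x)
    x_bar = statistics.mean(x)
    y_bar = statistics.mean(y)
    s_x = statistics.stdev(x)   # sample standard deviation of x
    s_y = statistics.stdev(y)   # sample standard deviation of y

    # Correlation r from the average of the standardized products.
    r = sum((xi - x_bar) / s_x * (yi - y_bar) / s_y
            for xi, yi in zip(x, y)) / (n - 1)

    # Least-squares slope and intercept from the summary statistics.
    b = r * s_y / s_x            # slope: b = r * (sy / sx)
    a = y_bar - b * x_bar        # intercept: a = y-bar - b * x-bar

    def predict(x_new):
        """Predicted response y-hat = a + b * x_new."""
        return a + b * x_new

    # residual = observed response - predicted response, for each data point.
    residuals = [yi - predict(xi) for xi, yi in zip(x, y)]

    print(f"slope b = {b:.3f}, intercept a = {a:.3f}")
    print("residuals:", [round(e, 3) for e in residuals])

Note that if b = 0, then a = y-bar and every prediction is simply y-bar, matching the remark above that the mean of the y's is the best predictor when the slope is zero.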