Stat 312: Lecture 18
Least Squares Estimation
Moo K. Chung (mchung@stat.wisc.edu)
April 1, 2003

Concepts

1. Least squares estimation. Given paired data $(x_1, y_1), \ldots, (x_n, y_n)$, we find the line that minimizes the sum of squared errors (SSE):

   $\mathrm{SSE} = \sum_{j=1}^{n} r_j^2 = \sum_{j=1}^{n} (y_j - \hat{y}_j)^2 = \sum_{j=1}^{n} (y_j - \hat{\beta}_0 - \hat{\beta}_1 x_j)^2.$

   The fitted regression line is then $\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x$.

2. Setting the partial derivatives of SSE with respect to $\hat{\beta}_0$ and $\hat{\beta}_1$ to zero, we get the normal equations:

   $n\hat{\beta}_0 + \hat{\beta}_1 \sum_{j=1}^{n} x_j = \sum_{j=1}^{n} y_j,$
   $\hat{\beta}_0 \sum_{j=1}^{n} x_j + \hat{\beta}_1 \sum_{j=1}^{n} x_j^2 = \sum_{j=1}^{n} x_j y_j.$

   Solving these equations, we get

   $\hat{\beta}_1 = \dfrac{S_{xy}}{S_{xx}} \quad\text{and}\quad \hat{\beta}_0 = \bar{y} - \bar{x}\,\dfrac{S_{xy}}{S_{xx}},$

   where the sample covariance term is $S_{xy} = \sum_{j=1}^{n} (x_j y_j - \bar{x}\bar{y}) = \sum_{j=1}^{n} x_j y_j - n\bar{x}\bar{y}$, and analogously $S_{xx} = \sum_{j=1}^{n} x_j^2 - n\bar{x}^2$. (A numerical sketch of these computations follows the list.)

3. Outliers are data points with unusually large residuals. Outliers may cause a lack of fit of the regression line.

4. $\sigma^2$ determines the amount of variability inherent in the linear model. An unbiased estimator of $\sigma^2$ is $\hat{\sigma}^2 = \mathrm{SSE}/(n-2)$.
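As a minimal sketch of Concepts 1, 2, and 4, the following Python snippet computes the least squares estimates from the closed-form $S_{xy}/S_{xx}$ formulas and then forms $\hat{\sigma}^2$; the data values are hypothetical and stand in for any paired observations $(x_j, y_j)$.

    import numpy as np

    # Hypothetical paired data (x_1, y_1), ..., (x_n, y_n).
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
    y = np.array([2.1, 3.9, 6.2, 7.8, 9.9, 12.1])

    n = len(x)
    xbar, ybar = x.mean(), y.mean()

    # Sums of squares and cross products from the notes:
    # S_xy = sum x_j y_j - n*xbar*ybar,  S_xx = sum x_j^2 - n*xbar^2.
    Sxy = np.sum(x * y) - n * xbar * ybar
    Sxx = np.sum(x * x) - n * xbar**2

    # Least squares estimates (the solution of the normal equations).
    beta1_hat = Sxy / Sxx
    beta0_hat = ybar - beta1_hat * xbar

    # Residuals, SSE, and the unbiased estimator of sigma^2.
    residuals = y - (beta0_hat + beta1_hat * x)
    SSE = np.sum(residuals**2)
    sigma2_hat = SSE / (n - 2)

    print(f"beta0_hat = {beta0_hat:.4f}, beta1_hat = {beta1_hat:.4f}")
    print(f"SSE = {SSE:.4f}, sigma2_hat = {sigma2_hat:.4f}")

Unusually large entries in residuals flag the candidate outliers of Concept 3, and the slope and intercept can be cross-checked against np.polyfit(x, y, 1).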