Stat 312: Lecture 18
Least Squares Estimation
Moo K. Chung
[email protected]
April 1, 2003

Concepts

1. Least squares estimation. Given paired data $(x_1, y_1), \ldots, (x_n, y_n)$, we find the line that minimizes the sum of squared errors (SSE):
$$\mathrm{SSE} = \sum_{j=1}^{n} r_j^2 = \sum_{j=1}^{n} (y_j - \hat{y}_j)^2 = \sum_{j=1}^{n} (y_j - \beta - \beta_1 x_j)^2.$$
The fitted regression line is then $\hat{y} = \hat{\beta} + \hat{\beta}_1 x$.

2. Differentiating SSE with respect to $\beta$ and $\beta_1$ and setting the derivatives to zero yields the normal equations
$$\beta + \bar{x}\,\beta_1 = \bar{y}, \qquad \bar{x}\,\beta + \overline{x^2}\,\beta_1 = \overline{xy}.$$
Solving these equations gives
$$\hat{\beta}_1 = \frac{S_{xy}}{S_{xx}}, \qquad \hat{\beta} = \bar{y} - \bar{x}\,\frac{S_{xy}}{S_{xx}},$$
where the sample covariance term is $S_{xy} = \sum_{j=1}^{n}(x_j - \bar{x})(y_j - \bar{y}) = n(\overline{xy} - \bar{x}\bar{y})$ and $S_{xx} = \sum_{j=1}^{n}(x_j - \bar{x})^2$. (A numerical sketch of these formulas follows the list.)

3. Outliers are data points with unusually large residuals. Outliers can cause a regression line to fit the data poorly.

4. $\sigma^2$ determines the amount of variability inherent in the linear model. An unbiased estimator of $\sigma^2$ is
$$\hat{\sigma}^2 = \frac{\mathrm{SSE}}{n-2}.$$
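To make the estimation formulas concrete, here is a minimal Python sketch. The data points are made up for illustration (they are not from the lecture); everything else follows the closed-form solution above, with numpy's polyfit used only as a cross-check.

```python
import numpy as np

# Made-up sample data, roughly linear with noise (illustrative only).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.9])
n = len(x)

# Closed-form least squares solution from the normal equations:
#   beta1_hat = Sxy / Sxx,  beta_hat = ybar - xbar * beta1_hat.
Sxy = np.sum((x - x.mean()) * (y - y.mean()))  # = n * (mean(x*y) - xbar*ybar)
Sxx = np.sum((x - x.mean()) ** 2)
beta1_hat = Sxy / Sxx
beta_hat = y.mean() - x.mean() * beta1_hat

# Residual sum of squares and the unbiased variance estimate SSE / (n - 2).
residuals = y - (beta_hat + beta1_hat * x)
SSE = np.sum(residuals ** 2)
sigma2_hat = SSE / (n - 2)

print(f"beta_hat = {beta_hat:.4f}, beta1_hat = {beta1_hat:.4f}, "
      f"sigma2_hat = {sigma2_hat:.4f}")

# Cross-check the slope and intercept against numpy's degree-1 polynomial fit.
slope, intercept = np.polyfit(x, y, deg=1)
assert np.isclose(slope, beta1_hat) and np.isclose(intercept, beta_hat)
```

A point with an unusually large residual (Concept 3) can be flagged from the same quantities, e.g. by comparing $|r_j|$ against $\hat{\sigma} = \sqrt{\hat{\sigma}^2}$.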