
Stat 312: Lecture 18
Least Squares Estimation
Moo K. Chung, [email protected]
April 1, 2003

Concepts

1. Least squares estimation. Given paired data $(x_1, y_1), \ldots, (x_n, y_n)$, we find the line that minimizes the sum of squared errors (SSE):
$$\mathrm{SSE} = \sum_{j=1}^{n} r_j^2 = \sum_{j=1}^{n} (y_j - \hat{y}_j)^2 = \sum_{j=1}^{n} (y_j - \beta_0 - \beta_1 x_j)^2.$$
The resulting regression line is $y = \hat{\beta}_0 + \hat{\beta}_1 x$.

2. Differentiating SSE with respect to $\beta_0$ and $\beta_1$ and setting the derivatives to zero, we get the normal equations:
$$\beta_0 + \bar{x}\,\beta_1 = \bar{y}, \qquad \bar{x}\,\beta_0 + \overline{x^2}\,\beta_1 = \overline{xy}.$$
Solving these equations, we get
$$\hat{\beta}_1 = \frac{S_{xy}}{S_{xx}}, \qquad \hat{\beta}_0 = \bar{y} - \bar{x}\,\frac{S_{xy}}{S_{xx}},$$
where the sample covariance is $S_{xy} = n(\overline{xy} - \bar{x}\bar{y}) = \sum_{j=1}^{n} (x_j - \bar{x})(y_j - \bar{y})$.

3. Outliers are data points with unusually large residuals. Outliers can cause a lack of fit of the regression line.

4. $\sigma^2$ determines the amount of variability inherent in the linear model. An unbiased estimator of $\sigma^2$ is
$$\hat{\sigma}^2 = \frac{\mathrm{SSE}}{n-2}.$$
One simple way of computing this is to use $\mathrm{SSE} = S_{yy} - \hat{\beta}_1 S_{xy} = S_{yy} - S_{xy}^2 / S_{xx}$.

In-class problems

Example 1. Ten students took two midterm exams.

Student k    01  02  03  04  05  06  07  08  09  10
Midterm 1    80  75  60  90  99  60  55  85  65  70
Midterm 2    70  60  70  72  95  66  60  80  70  60

Let's find the regression line. In R (the y vector completed from the table above):

> x <- c(80, 75, 60, 90, 99, 60, 55, 85, 65, 70)
> y <- c(70, 60, 70, 72, 95, 66, 60, 80, 70, 60)
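As a minimal sketch of how the computation proceeds (these lines are not from the original handout; b0, b1, SSE, and sigma2 are just illustrative variable names), we can evaluate the formulas from Concepts 2 and 4 by hand and then check against R's built-in least squares fit, lm():

> Sxy <- sum(x*y) - length(x)*mean(x)*mean(y)   # S_xy = n(mean(xy) - xbar*ybar)
> Sxx <- sum(x^2) - length(x)*mean(x)^2         # S_xx
> Syy <- sum(y^2) - length(y)*mean(y)^2         # S_yy
> b1 <- Sxy/Sxx                                 # slope estimate, hat beta_1
> b0 <- mean(y) - b1*mean(x)                    # intercept estimate, hat beta_0
> SSE <- Syy - Sxy^2/Sxx                        # computing formula from Concept 4
> sigma2 <- SSE/(length(x) - 2)                 # unbiased estimator of sigma^2
> lm(y ~ x)                                     # built-in fit; coefficients should match b0, b1

For these data the hand computation gives a slope of approximately 0.55 and an intercept of approximately 29.5, and lm(y ~ x) should report the same coefficients.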
