Stat 312: Lecture 19
Least squares estimation
Moo K. Chung
firstname.lastname@example.org
Nov 30, 2004

Concepts

1. Least squares method. In the previous lecture, we studied the least squares estimation method for estimating parameters in linear regression. The same method can be used to estimate parameters in other models. Given measurements $y_1, y_2, \ldots, y_n$, we model them as
$$Y_i = \mu + \epsilon_i,$$
where $E\epsilon_i = 0$ and $V\epsilon_i = \sigma^2$, with no assumption of normality. We are interested in estimating $\mu = EY$, the population mean. Let $\hat\mu$ be the estimator of $\mu$. Then the predicted value is $\hat y_i = \hat\mu$ and the residual error is $r_i = y_i - \hat y_i$, so the total sum of squared errors (SSE) is
$$\mathrm{SSE} = \sum_{i=1}^{n} (y_i - \hat\mu)^2.$$
To find the minimum of the SSE, we differentiate with respect to $\hat\mu$ and set the derivative to zero:
$$\frac{d\,\mathrm{SSE}}{d\hat\mu} = -2\sum_{i=1}^{n} (y_i - \hat\mu) = 0,$$
which gives $\hat\mu = \bar y$, the sample mean.

2. Weighted least squares method. Suppose we have two populations. Measurements are taken from the first population, $x_1, x_2, \ldots, x_m$, and the second population:
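The least-squares result of Concept 1 can be checked numerically: for any data set, the SSE evaluated at the sample mean should be no larger than at any nearby candidate value of $\hat\mu$. The sketch below uses a hypothetical data set (not from the lecture) to illustrate this.

```python
# Numerical check that SSE(mu_hat) = sum_i (y_i - mu_hat)^2
# is minimized at mu_hat = y_bar, the sample mean.

def sse(y, mu_hat):
    """Total sum of squared residuals for the constant-mean model Y_i = mu + eps_i."""
    return sum((yi - mu_hat) ** 2 for yi in y)

y = [2.1, 3.5, 1.9, 4.2, 3.3]     # hypothetical measurements
y_bar = sum(y) / len(y)           # closed-form least squares estimate

# Compare the SSE at the sample mean with the SSE at nearby candidates.
candidates = [y_bar + d for d in (-0.5, -0.1, 0.0, 0.1, 0.5)]
best = min(candidates, key=lambda m: sse(y, m))
```

Over this grid of candidates, `best` coincides with `y_bar`, matching the calculus derivation above.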
This note was uploaded on 01/31/2008 for the course STAT 312 taught by Professor Chung during the Fall '04 term at Wisconsin.