Stat 312: Lecture 19
Least squares estimation
Moo K. Chung
[email protected]
Nov 30, 2004

Concepts

1. Least squares method. In the previous lecture, we studied the least squares estimation method for estimating parameters in linear regression. The same method can be used to estimate parameters in other models. Given measurements y_1, y_2, ..., y_n, model them as

    Y_i = μ + ε_i,   where E ε_i = 0 and V ε_i = σ²,

with no assumption of normality. We are interested in estimating μ = E Y, the population mean. Let μ̂ be the estimator of μ. Then the predicted value is ŷ_i = μ̂ and the residual error is r_i = y_i − ŷ_i. So the total sum of squared errors (SSE) is

    SSE = Σ_{i=1}^n (y_i − μ̂)².

To find the minimum of SSE, we differentiate SSE with respect to μ̂, set the derivative to zero, and get μ̂ = ȳ.

2. Weighted least squares method. Suppose we have two populations. Measurements are taken from the first population, x_1, x_2, ..., x_m, and from the second population, y_1, y_2, ..., y_n. They are modeled as X_i = μ + ε_i with E ε_i
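The calculus in item 1 can be checked numerically: the value of μ̂ that minimizes SSE should coincide with the sample mean ȳ. A minimal sketch (the data values are invented for illustration, not from the lecture):

```python
def sse(mu_hat, ys):
    """Sum of squared errors: SSE = sum_i (y_i - mu_hat)^2."""
    return sum((y - mu_hat) ** 2 for y in ys)

# Made-up measurements y_1, ..., y_n.
ys = [2.1, 3.4, 1.9, 2.8, 3.0]
ybar = sum(ys) / len(ys)

# Scan a grid of candidate estimates; the grid minimizer of SSE
# should land on the sample mean, as the derivation predicts.
candidates = [i / 1000 for i in range(1000, 4001)]
best = min(candidates, key=lambda m: sse(m, ys))

print(ybar)  # sample mean
print(best)  # grid minimizer of SSE, agrees with ybar
```

This only verifies the closed-form answer on one data set; the derivation in the notes shows it holds for any y_1, ..., y_n.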