Lecture 20 notes



Stat 312: Lecture 19
Least squares estimation
Moo K. Chung, mchung@stat.wisc.edu
Nov 30, 2004

Concepts

1. Least squares method. In the previous lecture, we studied the least squares estimation method for estimating parameters in linear regression. The same method can be used to estimate parameters in other models. Given measurements $y_1, y_2, \ldots, y_n$, they can be modeled as $Y_i = \mu + \epsilon_i$, where $E\epsilon_i = 0$ and $V\epsilon_i = \sigma^2$, with no assumption of normality. We are interested in estimating $\mu = EY$, the population mean. Let $\hat\mu$ be the estimator of $\mu$. Then the predicted value is $\hat y_i = \hat\mu$ and the residual error is $r_i = y_i - \hat y_i$, so the total sum of squared errors (SSE) is
$$\mathrm{SSE} = \sum_{i=1}^{n} (y_i - \hat\mu)^2.$$
To find the minimum of SSE, we differentiate with respect to $\hat\mu$ and set the derivative to zero:
$$\frac{d\,\mathrm{SSE}}{d\hat\mu} = -2\sum_{i=1}^{n} (y_i - \hat\mu) = 0,$$
which gives $\hat\mu = \bar y$, the sample mean.

2. Weighted least squares method. Suppose we have two populations. Measurements are taken from the first population, $x_1, x_2, \ldots, x_m$, and the second population:
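The result in Concept 1 can be checked numerically: the SSE, viewed as a function of the candidate estimate, should be smallest at the sample mean. This is a minimal sketch with made-up data values (not from the lecture):

```python
# Illustrative data (assumed values, not from the lecture notes).
y = [2.1, 3.4, 2.8, 3.0, 2.7]
ybar = sum(y) / len(y)  # sample mean, the claimed least-squares estimate

def sse(mu_hat):
    """Total sum of squared errors for a candidate estimate mu_hat."""
    return sum((yi - mu_hat) ** 2 for yi in y)

# Among a grid of candidates around ybar, the sample mean minimizes SSE.
candidates = [ybar + d for d in (-0.5, -0.1, 0.0, 0.1, 0.5)]
best = min(candidates, key=sse)
assert best == ybar
```

Perturbing the estimate by $d$ increases the SSE by exactly $n d^2$, which is why the sample mean wins against every other candidate on the grid.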

This note was uploaded on 01/31/2008 for the course STAT 312 taught by Professor Chung during the Fall '04 term at Wisconsin.
