E7 Lecture 14 Summary: Generalized Linear Least Squares Regression

E7 Lecture 14: Generalized Linear Least Squares Regression
Tad Patzek, Civil & Environmental Engineering, U.C. Berkeley
March 19, 2008

Subjects already covered
In Lecture 11, we learned about:
- Mean, variance, standard deviation, covariance
- Example: trends in the cigarette data
- Least squares regression to a straight line

Subjects covered in Lecture 14
Today we will learn about:
- Generalized linear least squares regression to a polynomial
The related MATLAB files and the Lecture 14 slides have been posted on bSpace.

Population sample
Let y_1, y_2, y_3, ..., y_N represent a random sample of size N from any population.

Sample mean \bar{y}
Estimate of the population mean \mu:
\mu \approx \bar{y} = \frac{y_1 + y_2 + \cdots + y_N}{N} = \frac{1}{N} \sum_{i=1}^{N} y_i

Sample variance s^2
Estimate of the population variance \sigma^2:
\sigma^2 \approx s^2 = \frac{\sum_{i=1}^{N} (y_i - \bar{y})^2}{N - 1}

Sample standard deviation s
Estimate of the population standard deviation \sigma:
\sigma \approx s = \sqrt{\frac{\sum_{i=1}^{N} (y_i - \bar{y})^2}{N - 1}}

Fitting data to a straight line
We have N measurements of a response variable (e.g., nicotine content), {y_i}, at discrete values of an explanatory variable (e.g., tar content or time), {x_i}, where N is very large.

At first, we consider the simplest possible model of the data, a straight line:
y(x) = a_0 + a_1 x \equiv y(x; a_0, a_1)
This problem is called linear regression.

We assume that the uncertainty (noise), \sigma_i, associated with each measurement y_i is known, and that each value of x_i is known exactly.

Example...
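A minimal MATLAB sketch of the sample statistics and the straight-line fit summarized above. This is not the code posted on bSpace; the data values below are made up purely for illustration.

% Sample statistics and a straight-line least-squares fit (illustrative data).
y = [15.8 14.9 9.0 4.1 30.0 8.8 12.4 16.6];   % response values, e.g. nicotine content (made-up)
x = [1 2 3 4 5 6 7 8];                        % explanatory variable, e.g. tar content (made-up)
N = numel(y);

ybar = sum(y)/N;                   % sample mean, same as mean(y)
s2   = sum((y - ybar).^2)/(N-1);   % sample variance, same as var(y)
s    = sqrt(s2);                   % sample standard deviation, same as std(y)

% Straight-line model y(x) = a0 + a1*x fitted by least squares.
% polyfit returns coefficients in descending powers: [a1 a0].
p    = polyfit(x, y, 1);
a1   = p(1);  a0 = p(2);
yfit = polyval(p, x);              % fitted values at the data points

fprintf('ybar = %.3f, s = %.3f, a0 = %.3f, a1 = %.3f\n', ybar, s, a0, a1);

Note that polyfit solves the unweighted least-squares problem; incorporating the known measurement uncertainties \sigma_i would require a weighted formulation, which this sketch does not attempt.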
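The lecture's main topic, generalized linear least squares regression to a polynomial, can likewise be sketched by building a design matrix and solving with MATLAB's backslash operator. Again, this is only an illustration under assumed data and an assumed polynomial degree, not the posted lecture code.

% Generalized linear least squares: fit a polynomial by building a design
% matrix whose columns are the basis functions 1, x, x^2, ..., x^n.
x = (0:0.5:5)';                                   % explanatory variable (assumed)
y = 2 + 1.5*x - 0.3*x.^2 + 0.2*randn(size(x));    % synthetic noisy response (assumed)

n = 2;                           % polynomial degree (assumed)
A = ones(length(x), n+1);        % design matrix; first column is all ones
for k = 1:n
    A(:, k+1) = x.^k;            % columns x, x^2, ..., x^n
end

a    = A \ y;                    % least-squares coefficients [a0; a1; ...; an]
yfit = A * a;                    % fitted polynomial values at the data points

% polyfit solves the same problem (coefficients ordered from highest power down):
p = polyfit(x, y, n);

The backslash operator solves the overdetermined system A*a ≈ y in the least-squares sense, which is the standard way to pose polynomial regression as a linear least squares problem.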
