ECOR 2606 - Lecture 17 - Linear Regression

General linear least squares regression

Linear regression: fit a curve of the form y = a*x + b to the data points. Use polyfit(x, y, 1).

Polynomial regression: fit a curve of the form
y = a_n*x^n + a_(n-1)*x^(n-1) + ... + a_1*x + a_0
to the data points. Use polyfit(x, y, n), where n is the order of the polynomial. Linear regression is the special case (n = 1) of polynomial regression.

General linear least squares regression: fit a curve of the form
y = a_0*z_0(x) + a_1*z_1(x) + ... + a_m*z_m(x)
to the data points. Here z_0, z_1, ..., z_m (the basis functions) are arbitrary functions of x. Polynomial regression is the special case (z_0 = 1, z_1 = x, z_2 = x^2, ...) of the general case.

Example: we can fit a curve of the form y = a_0*(1) + a_1*cos(ωx) + a_2*sin(ωx) to the data points. In this case the basis functions are (1), cos(ωx), and sin(ωx), and the coefficients a_0, a_1, and a_2 are chosen so as to minimize the sum of the squares of the errors.

The "linear" in general linear least squares regression comes from the fact that y is a linear combination of functions of x. The basis functions themselves can be highly nonlinear (e.g. sin and cos), but they must involve only constants and x. A model such as y = a_0*(1 - exp(-a_1*x)) is unacceptable: a_1 appears inside the exponential rather than as a linear coefficient, so the model cannot be converted to the required form.
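The preview does not show how the coefficients are actually computed in the general case, but a standard way to do it in MATLAB is to build a matrix Z whose columns are the basis functions evaluated at the data points and then solve the resulting least squares problem with the backslash operator. The sketch below uses the 1, cos(ωx), sin(ωx) example from above; the sample data, the value of w, and the variable names are made up for illustration and are not from the lecture.

% Minimal sketch: general linear least squares fit of
% y = a0*(1) + a1*cos(w*x) + a2*sin(w*x), assuming w is known.
x = (0:0.5:10)';                                  % column vector of x values
y = 2 + 3*cos(1.5*x) - sin(1.5*x) + 0.1*randn(size(x));  % made-up noisy data
w = 1.5;                                          % assumed known frequency

% Design matrix Z: one column per basis function, one row per data point.
Z = [ones(size(x)), cos(w*x), sin(w*x)];

% Backslash returns the least squares solution of Z*a = y (approximately)
% when Z has more rows than columns.
a = Z \ y;                                        % a = [a0; a1; a2]

yfit = Z * a;                                     % fitted values at the data points

% Polynomial regression is just the special case with basis functions
% 1, x, x^2, ..., x^n, which is what polyfit does:
n = 2;
p = polyfit(x, y, n);

The same pattern works for any choice of basis functions: only the columns of Z change, while the solution step (Z \ y) stays the same.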
