22-Least-Squares-fitting

Least-Squares Data Fitting
Dhavide Aruliah, UOIT, MATH 2070U

Outline
1. Least-squares data-fitting
2. Least-squares fitting with a straight line (regression line)
3. Polynomial least-squares approximation
4. Least-squares fitting of general models

Data-fitting problem
Given n + 1 data points \{ (x_0, y_0), (x_1, y_1), \ldots, (x_n, y_n) \} with the x_k distinct (k = 0 : n), determine a function \tilde{f} that approximately satisfies

    \tilde{f}(x_k) = y_k \quad (k = 0 : n).

[Figure: scatter plot of sample data points, x-axis from -5 to 5, y-axis from -3 to 3]

Least-squares approximation
To solve the data-fitting problem, state it more precisely:
- Assume \tilde{f} \in V, where V is a vector space of dimension m + 1.
- A basis of linearly independent functions for V is \{ \phi_\ell(x) \}_{\ell=0}^{m} = \{ \phi_0(x), \phi_1(x), \ldots, \phi_m(x) \}.
- Any g \in V admits the representation g(x) = \sum_{\ell=0}^{m} b_\ell \phi_\ell(x).
- The least-squares approximation \tilde{f} satisfies

    \sum_{k=0}^{n} \left[ y_k - \tilde{f}(x_k) \right]^2 \leq \sum_{k=0}^{n} \left[ y_k - g(x_k) \right]^2 \quad \text{for every } g \in V.

Regression line
- Let \tilde{f} be a straight line, i.e., \tilde{f}(x) = a_0 + a_1 x.
- Given n + 1 data points \{ (x_k, y_k) \}_{k=0}^{n}, seek the straight-line function \tilde{f} that minimises the data misfit.
- Define the objective function by

    \Phi(b_0, b_1) := \sum_{k=0}^{n} \left[ y_k - (b_0 + b_1 x_k) \right]^2.

- The desired coefficients are the minimisers a_0, a_1 such that \Phi(a_0, a_1) \leq \Phi(b_0, b_1) for every b_0, b_1 \in \mathbb{R}.
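The regression-line objective can be explored numerically. Below is a minimal Python sketch, not part of the original slides: the data points are invented for illustration, and NumPy's `polyfit` (which minimises the same sum-of-squares misfit) stands in for the coefficients a_0, a_1. Evaluating the objective directly shows the fitted coefficients beat a nearby candidate.

```python
import numpy as np

def objective(b0, b1, x, y):
    """Data misfit Phi(b0, b1) = sum_k [y_k - (b0 + b1 * x_k)]^2."""
    return np.sum((y - (b0 + b1 * x)) ** 2)

# Sample data points (x_k, y_k), k = 0..n (invented for illustration)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# numpy.polyfit with deg=1 minimises the same objective;
# it returns coefficients highest degree first: [a1, a0]
a1, a0 = np.polyfit(x, y, deg=1)

# The minimiser should give a smaller misfit than a perturbed candidate
assert objective(a0, a1, x, y) < objective(a0 + 0.1, a1, x, y)
print(a0, a1)
```

For this data the fitted line works out to a_0 = 1.04, a_1 = 0.99; any other choice of (b_0, b_1) gives a strictly larger value of the objective.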
Solving the multivariate optimisation problem
- Necessary conditions for a minimiser of \Phi(b_0, b_1):

    \frac{\partial \Phi}{\partial b_0}(a_0, a_1) = 0, \qquad \frac{\partial \Phi}{\partial b_1}(a_0, a_1) = 0.

- These equations translate to

    \sum_{k=0}^{n} \left[ a_0 + a_1 x_k - y_k \right] = 0, \qquad \sum_{k=0}^{n} \left[ a_0 x_k + a_1 x_k^2 - y_k x_k \right] = 0.

- These are linear equations in the unknown coefficients a_0, a_1.
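The two linear equations above form a 2-by-2 system (the normal equations for the regression line) and can be solved directly. A minimal sketch with invented sample data, assuming NumPy is available:

```python
import numpy as np

# Sample data points (invented for illustration)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Rearranging the two conditions gives the linear system
#   (n+1)      a0 + (sum x_k)    a1 = sum y_k
#   (sum x_k)  a0 + (sum x_k^2)  a1 = sum x_k y_k
A = np.array([[len(x),  x.sum()],
              [x.sum(), (x ** 2).sum()]])
rhs = np.array([y.sum(), (x * y).sum()])

a0, a1 = np.linalg.solve(A, rhs)
print(a0, a1)  # regression-line coefficients
```

Writing the conditions as a matrix system makes the observation on the slide concrete: the objective is quadratic in (b_0, b_1), so its stationarity conditions are linear and have a unique solution whenever the x_k are not all equal.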