# Lecture 22: Least-Squares Data Fitting

Dhavide Aruliah, UOIT, MATH 2070U

### Outline

1. Least-squares data fitting
2. Least-squares fitting with a straight line (regression line)
3. Polynomial least-squares approximation
4. Least-squares fitting of general models

### The data-fitting problem

Given $n+1$ data points $\{(x_0, y_0), (x_1, y_1), \ldots, (x_n, y_n)\}$ with the $x_k$ distinct ($k = 0{:}n$), determine a function $f$ that approximately satisfies

$$f(x_k) \approx y_k \qquad (k = 0{:}n).$$

*(The original slide shows a scatter plot of sample data on the axes $[-5, 5] \times [-3, 3]$.)*

### Least-squares approximation

To solve the data-fitting problem, we state it more precisely:

- Assume $f \in V$, where $V$ is a vector space of dimension $m+1$.
- A basis of linearly independent functions for $V$ is $\{\psi_\ell(x)\}_{\ell=0}^{m} = \{\psi_0(x), \psi_1(x), \ldots, \psi_m(x)\}$.
- Any $g \in V$ admits the representation $g(x) = \sum_{\ell=0}^{m} b_\ell \, \psi_\ell(x)$.

The least-squares approximation is the $f \in V$ satisfying

$$\sum_{k=0}^{n} [y_k - f(x_k)]^2 \le \sum_{k=0}^{n} [y_k - g(x_k)]^2 \quad \text{for every } g \in V.$$
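The basis representation above translates directly into a linear least-squares computation. A minimal sketch, assuming a monomial basis $\psi_\ell(x) = x^\ell$ and hypothetical sample data (neither appears in the slides); `np.linalg.lstsq` finds the coefficients $b_\ell$ minimising the sum of squared residuals:

```python
import numpy as np

# Hypothetical sample data: n + 1 = 5 points (illustration only)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Basis for V: monomials psi_l(x) = x**l for l = 0..m (here m = 1)
m = 1
A = np.vander(x, m + 1, increasing=True)  # A[k, l] = x_k ** l

# Coefficients b minimising sum_k [y_k - g(x_k)]^2 over g in V
b, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
print(b)  # [b_0, b_1], the intercept and slope of the best-fit line
```

Raising `m` swaps in higher-degree monomials without changing the rest of the code, which is the polynomial least-squares case listed in the outline.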

This preview has intentionally blurred sections. Sign up to view the full version.

View Full Document
### The regression line

Let $f$ be a straight line, i.e., $f(x) = a_0 + a_1 x$. Given $n+1$ data points $\{(x_k, y_k)\}_{k=0}^{n}$, we seek the straight-line function $f$ that minimises the data misfit. Define the objective function $\Phi$ by

$$\Phi(b_0, b_1) := \sum_{k=0}^{n} [y_k - (b_0 + b_1 x_k)]^2.$$

The desired coefficients are the minimisers $a_0, a_1$ such that

$$\Phi(a_0, a_1) \le \Phi(b_0, b_1) \quad \text{for every } b_0, b_1 \in \mathbb{R}.$$

### Solving the multivariate optimisation problem

Necessary conditions for a minimiser of $\Phi(b_0, b_1)$:

$$\frac{\partial \Phi}{\partial b_0}(a_0, a_1) = 0, \qquad \frac{\partial \Phi}{\partial b_1}(a_0, a_1) = 0.$$

These equations translate to

$$\sum_{k=0}^{n} [a_0 + a_1 x_k - y_k] = 0, \qquad \sum_{k=0}^{n} \left[ a_0 x_k + a_1 x_k^2 - y_k x_k \right] = 0,$$

a pair of linear equations in the unknown coefficients $a_0, a_1$.
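Collecting the sums turns the two conditions into a $2 \times 2$ linear system that can be solved directly. A minimal sketch with hypothetical sample data (not from the slides):

```python
import numpy as np

# Hypothetical sample data (illustration only)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# The conditions dPhi/db0 = 0 and dPhi/db1 = 0 rearrange to:
#   (n+1)     * a0 + (sum x_k)   * a1 = sum y_k
#   (sum x_k) * a0 + (sum x_k^2) * a1 = sum x_k y_k
N = np.array([[len(x),  x.sum()],
              [x.sum(), (x**2).sum()]])
rhs = np.array([y.sum(), (x * y).sum()])

# Solve the 2x2 system for the regression-line coefficients
a0, a1 = np.linalg.solve(N, rhs)
print(a0, a1)  # intercept and slope of the regression line
```

For larger polynomial fits this explicit construction becomes ill-conditioned, which is one reason library routines solve the least-squares problem via orthogonal factorizations instead.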