ECOR 2606 - Lecture 16


• Use linear regression to find the best a and b.
• Then find α and β by applying α = 1/b and β = a/b.

The mathematics of linear regression:

Given: (x_1, y_1), (x_2, y_2), (x_3, y_3), ..., (x_n, y_n)
To find: the straight line y = ax + b that best fits the data.

We must minimize

    E = \sum_{i=1}^{n} (y_i - (a x_i + b))^2
      = \sum_{i=1}^{n} (y_i^2 + a^2 x_i^2 + b^2 + 2ab x_i - 2a x_i y_i - 2b y_i)

At the minimum:

    \partial E / \partial a = \sum_{i=1}^{n} (2a x_i^2 + 2b x_i - 2 x_i y_i) = 0
    \partial E / \partial b = \sum_{i=1}^{n} (2b + 2a x_i - 2 y_i) = 0

Dividing both equations by 2 and expressing them in matrix form gives:

    [ \sum x_i^2   \sum x_i ] [ a ]   [ \sum x_i y_i ]
    [ \sum x_i     n        ] [ b ] = [ \sum y_i     ]

where n = \sum_{i=1}^{n} 1.

Solving using Cramer's Rule produces:

    a = (n \sum x_i y_i - \sum x_i \sum y_i) / (n \sum x_i^2 - (\sum x_i)^2)
    b = (\sum x_i^2 \sum y_i - \sum x_i \sum x_i y_i) / (n \sum x_i^2 - (\sum x_i)^2)

Aside: b is more easily calculated using b = \bar{y} - a \bar{x}.

Calculating a and b involves first passing through the data points and calculating the following summations:

    \sum x_i,   \sum y_i,   \sum x_i^2,   \sum x_i y_i

Once this is done the formulas for a and b can be applied.

For linear regression ONLY, the correlation coefficient r can be computed using:

    r = (n \sum x_i y_i - \sum x_i \sum y_i) / \sqrt{(n \sum x_i^2 - (\sum x_i)^2)(n \sum y_i^2 - (\sum y_i)^2)}

In addition to the summations listed above, this requires \sum y_i^2.

Linear regression and the Casio calculator (formula is y = Ax + B):

    Mode Mode 2 (REG)        REG stands for regression
    1 (LIN)                  LIN stands for linear
    SHIFT CLR 1 (Scl) =      clears the statistical memory
    x1 , y1  DT              the DT key is the M+ key
    x2 , y2  DT              ... and so on until all points are entered

To retrieve the value of A: SHIFT S-VAR -> -> 1 (A) =   (the S-VAR key is the 2 key; -> is the right arrow)
To retrieve the value of B: SHIFT S-VAR -> -> 2 (B) =
To retrieve the correlation coefficient: SHIFT S-VAR -> -> 3 (r) =

Other forms of regression are also supported.

Polynomial regression:

Linear regression involves fitting a first-order polynomial (i.e. a polynomial of the form ax + b) to a set of data points. The basic idea is readily extended to higher-order polynomials.

Example:

    X:     0     3     6    9    12    15     18     21
    Y: 189.4  95.1  34.1  1.8   7.3  46.7  131.9  253.2

We want to fit a quadratic polynomial to this data.
A quadratic is a polynomial of the form y = ax^2 + bx + c. The fit can be done by using polyfit and specifying a second-order polynomial:

    >> p = polyfit (x, y, 2)   % 2 for second order

The result is a 3-element vector containing a, b, and c (in that order).

    >> xplot = linspace (-1, 22, 100);
    >> yplot = polyval (p, xplot);
    >> plot (x, y, 'o', xplot, yplot, 'MarkerSize', 10);
    >> fprintf ('The best fit curve is %6.4f * x^2 + %6.4f * x + %6.4f\n',...
                p(1), p(2), p(3));
    The best fit curve is 2.0088 * x^2 + -39.5105 * x + 193.4125
    >> f = @(x) p(1) * x .^ 2 + p(2) * x + p(3);
    >> r = correlate (x, y, f);
    >> fprintf ('The correlation coefficient is %6.4f\n', r);
    The correlation coefficient is 0.9991

The mathematics of quadratic regression:

Given: (x_1, y_1), (x_2, y_2), (x_3, y_3), ..., (x_n, y_n)
To find: the quadratic y = ax^2 + bx + c that best fits the data.

We must minimize

    E = \sum_{i=1}^{n} (y_i - (a x_i^2 + b x_i + c))^2

At the minimum:

    \partial E / \partial a = \partial E / \partial b = \partial E / \partial c = 0

These are three first-order equations. Filling in the details gives:

    [ \sum x_i^4   \sum x_i^3   \sum x_i^2 ] [ a ]   [ \sum x_i^2 y_i ]
    [ \sum x_i^3   \sum x_i^2   \sum x_i   ] [ b ] = [ \sum x_i y_i   ]
    [ \sum x_i^2   \sum x_i     n          ] [ c ]   [ \sum y_i       ]

where n = \sum_{i=1}^{n} 1.

The values of a, b, and c can be found by solving this system of equations. The system is the linear regression (first-order) system plus an extra row and column, and this pattern extends to higher-order polynomials.
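The linear-regression summation formulas can be checked with a short stand-alone sketch. The lecture itself uses MATLAB and the Casio calculator; the following Python version is only an illustration of the same arithmetic, computing a, b, and r from the running sums described in the notes:

```python
import math

def linear_regression(xs, ys):
    """Fit y = a*x + b by least squares using only the running sums
    from the notes; also returns the correlation coefficient r."""
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    syy = sum(y * y for y in ys)           # needed only for r

    det = n * sxx - sx * sx                # determinant of the 2x2 system
    a = (n * sxy - sx * sy) / det          # Cramer's rule for a
    b = (sy - a * sx) / n                  # the aside: b = ybar - a*xbar

    r = (n * sxy - sx * sy) / math.sqrt(det * (n * syy - sy * sy))
    return a, b, r

# Points that lie exactly on y = 2x + 1, so the fit is perfect (r = 1):
a, b, r = linear_regression([0, 1, 2, 3], [1, 3, 5, 7])
```

For data without an exact linear trend, |r| falls below 1; these are the same sums that drive the calculator's A, B, and r recall keys.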
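The quadratic normal equations can likewise be solved directly. This Python sketch (again only an illustration; the lecture uses MATLAB's polyfit) builds the 3x3 system from the summations and solves it with Cramer's rule, using the data from the lecture's example:

```python
def quadratic_regression(xs, ys):
    """Fit y = a*x^2 + b*x + c by solving the 3x3 normal equations."""
    n = len(xs)
    sx   = sum(xs)
    sx2  = sum(x ** 2 for x in xs)
    sx3  = sum(x ** 3 for x in xs)
    sx4  = sum(x ** 4 for x in xs)
    sy   = sum(ys)
    sxy  = sum(x * y for x, y in zip(xs, ys))
    sx2y = sum(x * x * y for x, y in zip(xs, ys))

    # Normal equations: m * [a, b, c]^T = rhs
    m = [[sx4, sx3, sx2],
         [sx3, sx2, sx],
         [sx2, sx,  n]]
    rhs = [sx2y, sxy, sy]

    def det3(t):
        """Determinant of a 3x3 matrix by cofactor expansion."""
        return (t[0][0] * (t[1][1] * t[2][2] - t[1][2] * t[2][1])
              - t[0][1] * (t[1][0] * t[2][2] - t[1][2] * t[2][0])
              + t[0][2] * (t[1][0] * t[2][1] - t[1][1] * t[2][0]))

    d = det3(m)
    coeffs = []
    for col in range(3):               # Cramer's rule, one unknown at a time
        mc = [row[:] for row in m]
        for row in range(3):
            mc[row][col] = rhs[row]
        coeffs.append(det3(mc) / d)
    return coeffs                      # [a, b, c]

# The lecture's example data:
xs = [0, 3, 6, 9, 12, 15, 18, 21]
ys = [189.4, 95.1, 34.1, 1.8, 7.3, 46.7, 131.9, 253.2]
a, b, c = quadratic_regression(xs, ys)
# Agrees with polyfit's 2.0088, -39.5105, 193.4125 to four decimal places
```

For the low orders used here Cramer's rule is fine; for higher-order polynomials a general linear solver (as polyfit uses internally) is the practical choice.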

This note was uploaded on 09/13/2013 for the course ECOR 2606 taught by Professor Goheen during the Fall '10 term at Carleton CA.
