3/10/2010

General linear least-squares regression:

Linear Regression:
  Fit a curve of the form y = a*x + b to data points
  Use polyfit(x, y, 1)

Polynomial Regression:
  Fit a curve of the form y = an*x^n + an-1*x^(n-1) + ... + a1*x + a0 to data points
  Use polyfit(x, y, n), where n is the order of the polynomial
  Linear regression is a special case (n = 1) of polynomial regression

General linear least-squares regression:
  Fit a curve of the form y = a0*z0(x) + a1*z1(x) + ... + am*zm(x) to data points
  z0, z1, ..., zm (the basis functions) are arbitrary functions of x
  Polynomial regression is a special case (z0 = 1, z1 = x, z2 = x^2, ...) of the general case

Example: We can fit a curve of the form y = a0*(1) + a1*cos(x) + a2*sin(x) to data points
  In this case the basis functions are (1), cos(x), and sin(x)
  a0, a1, and a2 are chosen so as to minimize the sum of the squares of the errors

The "linear" in general linear least-squares regression comes from the fact that y is a
linear combination of functions of x. The basis functions themselves can be highly
nonlinear (e.g. sin and cos).

The basis functions must involve only constants and x:
  y = a0*(1 - exp(a1*x)) is unacceptable, as it cannot be converted to the required form
  (a1 appears inside the exponential rather than as a linear coefficient)

For the first data point (x1, y1):
  y1 = a0*z0(x1) + a1*z1(x1) + a2*z2(x1) + ... + am*zm(x1) + e1
  where e1 is the error for this point

Similarly for the second data point (x2, y2):
  y2 = a0*z0(x2) + a1*z1(x2) + a2*z2(x2) + ... + am*zm(x2) + e2

And so on. In matrix form: y = Z*a + e, where

  Z = [ z0(x1)  z1(x1)  ...  zm(x1)
        z0(x2)  z1(x2)  ...  zm(x2)
         ...     ...    ...   ...
        z0(xn)  z1(xn)  ...  zm(xn) ]
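The sinusoidal example above can be written in exactly this matrix form and solved. A minimal Python/NumPy sketch (the data and the coefficient values 2.0, 3.0, -1.5 are invented for illustration, so the recovered coefficients can be checked against known values):

```python
import numpy as np

# Invented data from a known model: y = 2 + 3*cos(x) - 1.5*sin(x)
x = np.linspace(0.0, 2.0 * np.pi, 50)
y = 2.0 + 3.0 * np.cos(x) - 1.5 * np.sin(x)

# One column of Z per basis function: (1), cos(x), sin(x)
Z = np.column_stack([np.ones_like(x), np.cos(x), np.sin(x)])

# Least-squares solution of Z*a = y
a, *_ = np.linalg.lstsq(Z, y, rcond=None)
print(a)  # close to [2.0, 3.0, -1.5]
```

With noise-free data the fitted coefficients reproduce the generating model; with noisy data they would minimize the sum of squared errors instead.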
Z has one row for every data point
Z has one column for every basis function
Zij is the jth basis function evaluated at xi

We want to find the a that minimizes the sum of the squares of the elements of e.
The desired a can be found by solving Z^T*Z*a = Z^T*y (see next slide for proof)

  Z       is n x (m + 1)
  Z^T     is (m + 1) x n
  Z^T*Z   is (m + 1) x (m + 1)
  a       is (m + 1) x 1
  y       is n x 1
  Z^T*y   is (m + 1) x 1

These m + 1 equations in m + 1 unknowns are called the normal equations.

Example:
  % basis functions = (1), cos(w*x), sin(w*x), where w is a known frequency
  Z = zeros (length(x), 3);
  for k = 1 : length(x)
      Z(k, 1) = 1;
      Z(k, 2) = cos (w*x(k));
      Z(k, 3) = sin (w*x(k));
  end
  Zt = Z';
  a = (Zt * Z) \ (Zt * y);   % solve Z^T*Z*a = Z^T*y
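The normal-equations solve above can be sketched in Python/NumPy, including checks of the matrix dimensions listed earlier (the frequency w and the data are invented for illustration):

```python
import numpy as np

# Invented data from a known model: y = 1 - 2*cos(w*x) + 0.5*sin(w*x)
w = 2.0
n, m = 40, 2                       # n data points, m + 1 = 3 basis functions
x = np.linspace(0.0, np.pi, n)
y = 1.0 - 2.0 * np.cos(w * x) + 0.5 * np.sin(w * x)

# Z[i, j] = jth basis function evaluated at x[i]
Z = np.column_stack([np.ones(n), np.cos(w * x), np.sin(w * x)])
assert Z.shape == (n, m + 1)             # one row per point, one column per basis fn
assert (Z.T @ Z).shape == (m + 1, m + 1)

# Solve the (m+1) x (m+1) normal equations Z^T*Z*a = Z^T*y
a = np.linalg.solve(Z.T @ Z, Z.T @ y)
print(a)  # close to [1.0, -2.0, 0.5]
```

`np.linalg.solve` here plays the role of MATLAB's left division applied to the square normal-equations system.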
For interested students only, not in text:

  e = y - Z*a

  Sr = e1^2 + e2^2 + ... + en^2 = e^T * e
     = (y - Z*a)^T * (y - Z*a)
     = y^T*y - y^T*Z*a - a^T*Z^T*y + a^T*Z^T*Z*a

  dSr/da = -2*Z^T*y + 2*Z^T*Z*a

  At the minimum of Sr the derivative is zero:
  -2*Z^T*y + 2*Z^T*Z*a = 0, which gives Z^T*Z*a = Z^T*y

QR Factorization:
  Z^T*Z*a = Z^T*y can be ill conditioned (sensitive to roundoff errors)
  QR factorization is more robust
  Z is factored into Q and R (Z = Q*R)
  Q is an n x n orthogonal matrix (Q^-1 = Q^T)
  R is an n x (m + 1) upper triangular matrix
  We actually need only Q0 (the first m + 1 columns of Q) and R0 (the first m + 1 rows of R)
  Q0*R0 = Z (this is still the case because the rest of R is all zero)
  The least squares solution is found by solving R0*a = Q0^T*y

  Matlab:
  [Q0, R0] = qr (Z, 0);   % the optional 0 gives only the required parts of Q and R
  a = R0 \ (Q0' * y);

General least squares and left division:
  Assuming that n > m + 1 (more points than basis functions), Z*a = y is overdetermined:
  the number of equations is greater than the number of unknowns, so in general there is
  no exact solution. In such cases the left division operator produces a "best fit"
  (least squares) solution. Once Z has been created, a can be found directly:
  a = Z \ y;   % fit curve to data points
  Matlab uses QR factorization to find the least squares solution

Example:
  x = [-8.0 -6.0 -4.0 -2.0 0 2.0 4.0 6.0 8.0]';
  y = [817.5 279.9 139.4 41.6 23.8 8.7 36.0 158.1 339.4]';

We want to fit a curve of the form y = a0*x^3 + a1*x^2 + a2*x + a3.

This is polynomial regression:
  p = polyfit (x, y, 3);   % p is 1.1105 3.2687 0.2947 0.7874
  f = @(x) p(1) * x .^ 3 + p(2) * x .^ 2 + p(3) * x + p(4);
  r = correlate (x, y, f);   % r is 0.9936

We can also use general least squares regression with
z0(x) = x^3, z1(x) = x^2, z2(x) = x, z3(x) = 1.

Creating Z:
  Z = zeros (length(x), 4);   % preallocate for efficiency
  for k = 1 : length(x)
      Z(k, 1) = x(k)^3;
      Z(k, 2) = x(k)^2;
      Z(k, 3) = x(k);
      Z(k, 4) = 1;
  end
Using normal equations:
  Zt = Z';
  a = (Zt * Z) \ (Zt * y)   % solve Z^T*Z*a = Z^T*y
  % a' is 1.1105 3.2687 0.2947 0.7874

The normal equations are perhaps not a very good idea here (although the answer is OK):
  c = cond(Zt * Z);   % c is 1.5999e+005, the matrix is ill conditioned

Using QR decomposition:
  [Q0, R0] = qr(Z, 0);
  a = R0 \ (Q0' * y);   % a' is 1.1105 3.2687 0.2947 0.7874

Using left division:
  a = Z \ y;   % a' is 1.1105 3.2687 0.2947 0.7874

[Figure: data points and fitted curve]
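The QR and left-division routes above can be sketched in Python/NumPy. This is an illustrative sketch only: the data below is generated from an invented cubic (coefficients 1.1, -3.3, 0.3, 0.8), not the slide's measured values, so the two methods can be checked against each other and against the known model:

```python
import numpy as np

# Invented data from a known cubic on the same x grid as the slide's example
x = np.linspace(-8.0, 8.0, 9)
y = 1.1 * x**3 - 3.3 * x**2 + 0.3 * x + 0.8

# Cubic basis: z0 = x^3, z1 = x^2, z2 = x, z3 = 1
Z = np.column_stack([x**3, x**2, x, np.ones_like(x)])

# Economy-size QR (the analogue of MATLAB's qr(Z, 0)): Z = Q0*R0
Q0, R0 = np.linalg.qr(Z)              # Q0 is n x (m+1), R0 is (m+1) x (m+1)
a_qr = np.linalg.solve(R0, Q0.T @ y)  # solve R0*a = Q0^T*y

# Left-division analogue: a direct least-squares solve
a_ls, *_ = np.linalg.lstsq(Z, y, rcond=None)

print(np.allclose(a_qr, a_ls))  # True: both routes give the same coefficients
```

As on the slide, both routes agree; the QR route avoids forming Z^T*Z and its squared condition number.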