6.6 Applications of Least Squares

Least squares has many applications in science and engineering, e.g. statistical data analysis. Instead of the notation $A\vec{x} = \vec{b}$, we write
$$X\vec{\beta} = \vec{y}$$
where $X$ is the design matrix, $\vec{\beta}$ the parameter vector, and $\vec{y}$ the observation vector.

Least-squares curve fitting

Basic idea: given data points $(x_1, y_1), \dots, (x_n, y_n)$, "fit" the data with a simple curve. The simplest case is to fit the data with a straight line.

[Figure 1: Fitting a line to experimental data. Each data point $(x_j, y_j)$ is compared with the point $(x_j, \beta_0 + \beta_1 x_j)$ on the line $y = \beta_0 + \beta_1 x$; the vertical gap between them is the residual.]

Therefore: find constants $\beta_0, \beta_1$ so that the line $y = \beta_0 + \beta_1 x$ fits the data in the sense that at each point $x_j$ the predicted value $\beta_0 + \beta_1 x_j \approx y_j$, where $y_j$ is the observed value.

The least-squares line $y = \beta_0 + \beta_1 x$ minimizes the sum of squares of the residuals,
$$(\beta_0 + \beta_1 x_1 - y_1)^2 + \cdots + (\beta_0 + \beta_1 x_n - y_n)^2.$$
This line is called the line of regression of $y$ on $x$, and $\beta_0, \beta_1$ are called the (linear) regression coefficients.

Therefore, solve the system
$$\beta_0 + \beta_1 x_1 = y_1$$
$$\beta_0 + \beta_1 x_2 = y_2$$
$$\vdots$$
$$\beta_0 + \beta_1 x_n = y_n$$
or $X\vec{\beta} = \vec{y}$ with
$$X = \begin{bmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_n \end{bmatrix}, \qquad \vec{\beta} = \begin{bmatrix} \beta_0 \\ \beta_1 \end{bmatrix}, \qquad \vec{y} = \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix}.$$
Since the columns of $X$ are linearly independent (as long as at least two of the $x_i$ are distinct), we can solve the normal equations
$$X^T X \vec{\beta} = X^T \vec{y}.$$
The solution $\hat{\beta}$ minimizes the distance between $X\vec{\beta}$ and $\vec{y}$, i.e. it minimizes $\|\vec{y} - X\vec{\beta}\|$.
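A minimal numerical sketch of the procedure above, assuming Python with NumPy (the notes themselves do not specify a language); the data points are invented purely for illustration. It forms the design matrix, solves the normal equations for $\beta_0, \beta_1$, and checks the answer against NumPy's built-in least-squares solver.

```python
import numpy as np

# Invented example data (not from the notes).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Design matrix X: a column of ones (for beta_0) and the x-values (for beta_1).
X = np.column_stack([np.ones_like(x), x])

# Normal equations:  X^T X beta = X^T y,  solved for beta = [beta_0, beta_1].
beta = np.linalg.solve(X.T @ X, X.T @ y)
print("beta_0, beta_1 =", beta)

# Cross-check with NumPy's built-in least-squares solver.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print("lstsq gives    =", beta_lstsq)

# Residuals and their sum of squares (the quantity the least-squares line minimizes).
r = X @ beta - y
print("sum of squared residuals =", r @ r)
```

In practice, solvers such as np.linalg.lstsq (based on orthogonal factorizations) are usually preferred over explicitly forming $X^T X$, since the normal equations can be ill-conditioned; the sketch follows the normal-equation route only because that is the method derived in this section.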