Chapter 06.03
Linear Regression

After reading this chapter, you should be able to
1. define regression,
2. compare several residual-minimization criteria and choose a suitable one,
3. derive the constants of a linear regression model based on the least-squares criterion,
4. use the derived formulas for the constants of a linear regression model in examples, and
5. prove that the constants of the linear regression model are unique and correspond to a minimum.

Linear regression is the most popular regression model. In this model, we wish to predict the response to $n$ data points $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$ by a regression model given by
$$y = a_0 + a_1 x \qquad (1)$$
where $a_0$ and $a_1$ are the constants of the regression model.

A measure of goodness of fit, that is, how well $a_0 + a_1 x$ predicts the response variable $y$, is the magnitude of the residual $E_i$ at each of the $n$ data points:
$$E_i = y_i - (a_0 + a_1 x_i) \qquad (2)$$
Ideally, if all the residuals $E_i$ are zero, one has found an equation in which all the points lie on the model. Thus, minimization of the residuals is the objective of obtaining the constants of the regression model.

The most popular method of minimizing the residuals is the least-squares method, in which the estimates of the constants of the model are chosen such that the sum of the squared residuals, $\sum_{i=1}^{n} E_i^2$, is minimized.

Why minimize the sum of the squares of the residuals? Why not, for instance, minimize the sum of the residuals or the sum of the absolute values of the residuals? Alternatively, the constants of the model could be chosen such that the average residual is zero, without making the individual residuals small. Will any of these criteria yield unbiased
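The least-squares criterion leads to the standard closed-form solution for $a_0$ and $a_1$ (obtained by setting the partial derivatives of $\sum E_i^2$ to zero, as derived later in this chapter). A minimal sketch in Python, with illustrative function and variable names:

```python
# Sketch of least-squares linear regression y = a0 + a1*x using the
# standard normal-equation formulas:
#   a1 = (n*Sxy - Sx*Sy) / (n*Sxx - Sx^2),  a0 = ybar - a1*xbar
# Function name and variable names are illustrative, not from the text.

def linear_regression(x, y):
    """Return (a0, a1) minimizing the sum of squared residuals."""
    n = len(x)
    sx = sum(x)                                   # sum of x_i
    sy = sum(y)                                   # sum of y_i
    sxy = sum(xi * yi for xi, yi in zip(x, y))    # sum of x_i*y_i
    sxx = sum(xi * xi for xi in x)                # sum of x_i^2
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a0 = sy / n - a1 * sx / n
    return a0, a1

# Points lying exactly on y = 1 + 2x should be recovered with zero residuals:
a0, a1 = linear_regression([0, 1, 2, 3], [1, 3, 5, 7])
# a0 = 1.0, a1 = 2.0
```

For data that fall exactly on a line, every residual $E_i$ is zero and the fitted constants reproduce the line, as in the usage above.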
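To see why merely requiring the sum of the residuals to be zero is not a usable criterion, note that any line passing through the centroid $(\bar{x}, \bar{y})$ satisfies $\sum E_i = 0$, so this condition does not determine a unique model. A small sketch (data and names are illustrative) demonstrating two very different lines that both satisfy it:

```python
# Illustration (assumed example data): every line through the centroid
# (xbar, ybar) makes the residual sum zero, so "sum of residuals = 0"
# fails to single out one regression line.

x = [1.0, 2.0, 3.0, 4.0]
y = [1.2, 1.9, 3.1, 3.8]
xbar = sum(x) / len(x)
ybar = sum(y) / len(y)

def residuals(a0, a1):
    """Residuals E_i = y_i - (a0 + a1*x_i) for a candidate line."""
    return [yi - (a0 + a1 * xi) for xi, yi in zip(x, y)]

# Two different lines, both through the centroid:
flat  = residuals(ybar, 0.0)                # horizontal line, slope 0
steep = residuals(ybar - 5 * xbar, 5.0)     # slope 5 through the centroid

# Both residual sums are (essentially) zero ...
print(sum(flat), sum(steep))
# ... but the sums of squared residuals differ greatly:
print(sum(e * e for e in flat), sum(e * e for e in steep))
```

Minimizing the sum of squared residuals instead breaks this tie, because large individual residuals are penalized even when they cancel in the plain sum.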