Chapter 06.03
Linear Regression
After reading this chapter, you should be able to
1. define regression,
2. compare several criteria for minimizing the residuals and choose the right one,
3. derive the constants of a linear regression model based on the least squares method criterion,
4. use the derived formulas for the constants of a linear regression model in examples, and
5. prove that the constants of the linear regression model are unique and correspond to a minimum.
Linear regression is the most popular regression model.
In this model, we wish to predict the response to $n$ data points $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$ by a regression model given by
$$y = a_0 + a_1 x \qquad (1)$$
where $a_0$ and $a_1$ are the constants of the regression model.
A measure of goodness of fit, that is, how well $a_0 + a_1 x$ predicts the response variable $y$, is the magnitude of the residual $E_i$ at each of the $n$ data points.
$$E_i = y_i - (a_0 + a_1 x_i) \qquad (2)$$
Ideally, if all the residuals $E_i$ are zero, one may have found an equation in which all the points lie on the model. Thus, minimization of the residuals is an objective of obtaining the regression coefficients.
The most popular method to minimize the residuals is the least squares method, where the estimates of the constants of the model are chosen such that the sum of the squared residuals is minimized, that is, minimize $\sum_{i=1}^{n} E_i^2$.
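A minimal sketch of the least squares criterion, assuming hypothetical data: the closed-form expressions for $a_0$ and $a_1$ below are the standard result obtained by setting the partial derivatives of the sum of squared residuals to zero (the derivation the chapter develops), not a formula taken from this excerpt.

```python
# Least squares: choose a0, a1 minimizing S = sum((y_i - (a0 + a1*x_i))**2).
# The closed-form solution comes from setting dS/da0 = dS/da1 = 0.
# Data points are hypothetical, for illustration only.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 4.3, 5.9, 8.2]
n = len(x)

sx, sy = sum(x), sum(y)
sxx = sum(xi * xi for xi in x)
sxy = sum(xi * yi for xi, yi in zip(x, y))

a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
a0 = sy / n - a1 * sx / n                       # intercept

# Sum of squared residuals at the optimum
S = sum((yi - (a0 + a1 * xi)) ** 2 for xi, yi in zip(x, y))
print(a0, a1, S)
```

For this data the fitted line is $y = 0.15 + 1.99x$, and no other choice of $a_0$, $a_1$ gives a smaller sum of squared residuals.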
Why minimize the sum of the squares of the residuals? Why not, for instance, minimize the sum of the residuals or the sum of the absolute values of the residuals? Alternatively, the constants of the model could be chosen such that the average residual is zero without making the individual residuals small. Will any of these criteria yield unbiased estimates of the model constants?
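One quick way to see the trouble with simply summing the residuals: two very different lines can both make that sum zero. The data below are hypothetical, lying exactly on $y = x$.

```python
# Why not minimize the plain sum of residuals? A line far from the data
# can make the sum zero because positive and negative residuals cancel.
# Hypothetical data lying exactly on y = x:
x = [1.0, 2.0, 3.0]
y = [1.0, 2.0, 3.0]

def residual_sum(a0, a1):
    """Sum of residuals y_i - (a0 + a1*x_i) for the line y = a0 + a1*x."""
    return sum(yi - (a0 + a1 * xi) for xi, yi in zip(x, y))

print(residual_sum(0.0, 1.0))  # the true line y = x
print(residual_sum(2.0, 0.0))  # the poor fit y = 2: residuals -1, 0, 1 cancel
```

Both lines give a residual sum of zero, so this criterion cannot distinguish the true line from a poor one; squaring the residuals removes the cancellation.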

*Sign up*