Least Squares Fitting of Non-Linear Functions.
Least squares fitting is a method that determines the parameters of a function such that the function best fits a set of data points.
The simplest example is, of course, linear regression, where we fit a function

$$Y(x) = a + b \cdot x \qquad (1)$$
We have data points as value pairs $(x_i, y_i)$. To fit the parameters $a$ and $b$ we find those which minimize the sum of the squared deviations

$$S = \sum_{i=1}^{n} \bigl( y_i - Y(a,b,x_i) \bigr)^2 \qquad (2)$$
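As an illustration of Eq. (2), the sum of squared deviations for the linear model of Eq. (1) can be evaluated directly; the data values and parameter guesses below are made up for this sketch:

```python
# Residual sum of squares S from Eq. (2) for the linear model Y(x) = a + b*x.
def S(a, b, xs, ys):
    """Sum of squared deviations between measured ys and model values."""
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

# Synthetic data lying exactly on y = 1 + 2x.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

print(S(1.0, 2.0, xs, ys))  # 0.0 -- perfect fit
print(S(0.0, 2.0, xs, ys))  # 4.0 -- every residual is 1, four points
```

Minimizing this quantity over $a$ and $b$ is exactly the fitting problem posed above.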
between the measured values $y_i$ and the calculated values $Y(a,b,x_i)$. Note that $Y$ is now also considered a function of the parameters $a$ and $b$, because in the fit they are varied to minimize $S$.
The minimum is found at the point $(a,b)$ where the derivatives of $S$ with respect to $a$ and $b$ are zero:
$$\frac{1}{2}\,\frac{\partial}{\partial a}\left[\sum_{i=1}^{n} \bigl( y_i - Y(a,b,x_i) \bigr)^2\right] = 0, \qquad \frac{1}{2}\,\frac{\partial}{\partial b}\left[\sum_{i=1}^{n} \bigl( y_i - Y(a,b,x_i) \bigr)^2\right] = 0$$
$$\sum_{i=1}^{n} \bigl( y_i - Y(a,b,x_i) \bigr)\,\frac{\partial Y(a,b,x_i)}{\partial a} = 0, \qquad \sum_{i=1}^{n} \bigl( y_i - Y(a,b,x_i) \bigr)\,\frac{\partial Y(a,b,x_i)}{\partial b} = 0$$
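These two conditions hold for any model $Y$, linear or not. A minimal numerical check, using a hypothetical non-linear model $Y(a,b,x) = a\,e^{b x}$ (not from the text, chosen only for illustration) on noise-free synthetic data:

```python
import math

# Hypothetical non-linear model Y(a, b, x) = a * exp(b * x) and its
# derivatives with respect to the two parameters.
def Y(a, b, x):
    return a * math.exp(b * x)

def dY_da(a, b, x):
    return math.exp(b * x)

def dY_db(a, b, x):
    return a * x * math.exp(b * x)

# Left-hand sides of the two stationarity conditions above.
def conditions(a, b, xs, ys):
    c_a = sum((y - Y(a, b, x)) * dY_da(a, b, x) for x, y in zip(xs, ys))
    c_b = sum((y - Y(a, b, x)) * dY_db(a, b, x) for x, y in zip(xs, ys))
    return c_a, c_b

# Noise-free data generated with a = 2, b = 0.5: at the true parameters
# every residual vanishes, so both sums are exactly zero.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [Y(2.0, 0.5, x) for x in xs]

print(conditions(2.0, 0.5, xs, ys))  # (0.0, 0.0) at the minimum
print(conditions(1.0, 0.5, xs, ys))  # clearly non-zero away from it
```

For a general non-linear $Y$ these conditions cannot be solved in closed form and one resorts to iterative minimization; the linear model of Eq. (1) is the special case treated next.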
Inserting $Y(a,b,x_i) = a + b \cdot x_i$ from Eq. (1), with $\partial Y/\partial a = 1$ and $\partial Y/\partial b = x_i$, gives

$$\sum_{i=1}^{n} \bigl( y_i - (a + b \cdot x_i) \bigr) = 0, \qquad \sum_{i=1}^{n} \bigl( y_i - (a + b \cdot x_i) \bigr) \cdot x_i = 0$$
$$\sum_{i=1}^{n} y_i - n \cdot a - b \sum_{i=1}^{n} x_i = 0, \qquad \sum_{i=1}^{n} y_i \cdot x_i - a \sum_{i=1}^{n} x_i - b \sum_{i=1}^{n} x_i^2 = 0$$
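These two equations are linear in $a$ and $b$, so they can be solved by plain elimination once the four sums over the data are known. A sketch in plain Python (the data values are made up for illustration):

```python
# Solve the two normal equations above for a and b via the sums
# Sx = sum(x_i), Sy = sum(y_i), Sxx = sum(x_i^2), Sxy = sum(x_i*y_i).
def fit_line(xs, ys):
    n = len(xs)
    Sx = sum(xs)
    Sy = sum(ys)
    Sxx = sum(x * x for x in xs)
    Sxy = sum(x * y for x, y in zip(xs, ys))
    # First equation:  Sy  - n*a  - b*Sx  = 0
    # Second equation: Sxy - a*Sx - b*Sxx = 0
    # Eliminating a yields the slope, then a follows from the first equation.
    b = (n * Sxy - Sx * Sy) / (n * Sxx - Sx * Sx)
    a = (Sy - b * Sx) / n
    return a, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.1, 2.9, 5.2, 6.8]
a, b = fit_line(xs, ys)   # a ≈ 1.09, b ≈ 1.94 for these numbers
```

For data lying exactly on a line, `fit_line` recovers the intercept and slope exactly; with scatter, it returns the least-squares compromise.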
We solve both equations for $a$:
$$a = \frac{1}{n} \left( \sum_{i=1}^{n} y_i - b \sum_{i=1}^{n} x_i \right), \qquad a = \frac{\displaystyle\sum_{i=1}^{n} y_i \cdot x_i - b \sum_{i=1}^{n} x_i^2}{\displaystyle\sum_{i=1}^{n} x_i} \qquad (3)$$
giving