EE103, HW5, Winter 2009, Prof. S. E. Jacobsen
Page 1 / 5
EE103 Applied Numerical Computing, Winter 2009
HW 5
Due: 03/03/09
Your HW answers must contain your ID, Last Name, First Name, and the number of the Discussion Section in which you are enrolled.

Class:
The purpose of HWs is absolutely NOT the mere assignment of grades. The purpose of HW assignments is to engage the student in the process of learning the material of the course and the material that the instructor believes to be instructive. As such, your work should be your own and reflect your effort in the learning process. As a result, you are not to consult the notes, HWs, HW solutions, etc., of other courses or past offerings of this course. Of course, you may discuss the material of this course with other students of this class; however, your work must be your own.
Prob 1 (Continuation of Prob 3 (e) of HW 4):
For the given nonlinear least squares problem under consideration, we can calculate the Hessian and, as a result, we should be able to implement Newton's method, rather than GN, and possibly solve for the optimal parameters, (x1*, x2*).
(a)
You are to use the Newton code, NewtonMin.m, that has been placed at the website. If you wish, you may modify the output format. The code, NewtonMin.m, requires the use of the Hessian. In order to use the code, you'll need to write your function subroutine, 'hw5p1.m'. It must take the following form:

    function [fx, gx, Hx] = hw5p1(x, u, v)

where x = (x1, x2)' is the initial vector, (u, v) are the observations, fx is the value of the function at termination of Newton's method, and similarly for the gradient and Hessian at termination, gx, Hx.
Usage of NewtonMin.m:

    function [xMin, gx, Hx] = NewtonMin(f, x, tol, u, v)

where f = 'hw5p1', and tol = 1e-16. (Wherever "feval" appears in NewtonMin.m, you'll have to include u, v in the input; e.g., fxnew = feval(f, xnew, u, v).) Use "diary" to list your 'hw5p1.m', as well as your output and other comments you wish to state.
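NewtonMin.m itself is not reproduced here, but the basic Newton iteration it is built around can be sketched as follows. This is an illustrative Python sketch, not the course's MATLAB code: the `(fx, gx, Hx)` return convention mirrors the hw5p1.m interface described above, the stopping rule (gradient norm below tol) and the test function are assumptions, and any safeguards the real NewtonMin.m applies (e.g., when the Hessian is not PD) are omitted.

```python
import numpy as np

def newton_min(f, x0, tol=1e-8, max_iter=100):
    """Minimize via Newton's method.

    f(x) must return (fx, gx, Hx): the value, gradient, and Hessian
    at x, mirroring the [fx, gx, Hx] = hw5p1(x, u, v) interface.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx, gx, Hx = f(x)
        if np.linalg.norm(gx) < tol:
            break
        # Newton step: solve Hx * d = -gx, then update x.
        d = np.linalg.solve(Hx, -gx)
        x = x + d
    fx, gx, Hx = f(x)
    return x, fx, gx, Hx

# Illustrative check on a strictly convex quadratic (NOT the HW problem):
def quad(x):
    H = np.array([[2.0, 0.0], [0.0, 4.0]])
    g = H @ x
    return 0.5 * x @ H @ x, g, H

xmin, fmin, gmin, Hmin = newton_min(quad, [3.0, -2.0])
```

On a strictly convex quadratic, Newton's method reaches the minimizer in one step; the unsafeguarded version above can fail when the Hessian is not PD, which is exactly the issue part (b) asks about.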
(b)
The following is a graph of the function near to what appears to be an optimal point. By inspection, is it the case that the Hessian is PD at all points on the surface of the graph? If not, initiate NewtonMin.m at a point, on the graph, where the Hessian is not PD. Does the method converge to an optimal point? What property of the code is related to your answer to this last question? Use "diary".
Prob 2 (Partial Tutorial):

Tutorial:
Following Lecture Slides 8A, the Newton-Cotes approach to numerically approximating the integral of f on the interval [a, b] is

    ∫_a^b f(x) dx ≈ ∫_a^b P_{m-1}(x) dx

where m equally spaced samples of f(x) are taken and an (m-1) degree interpolating polynomial, P_{m-1}(x), is integrated on the interval [a, b].
Note the following, by referring to Lecture Slides 8A:
(1) When m = 2, we see that the NC approximation to the integral can be stated as

    ∫_a^b P_1(x) dx = ((b - a)/2) (f(a) + f(b))
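The m = 2 formula is the trapezoid rule: it integrates the straight line through (a, f(a)) and (b, f(b)). As a quick numerical check, here is an illustrative Python sketch (the assignment itself uses MATLAB; the test integrand x^2 is a made-up example):

```python
def trapezoid(f, a, b):
    # m = 2 samples: integrate the degree-1 interpolant P1 on [a, b].
    return (b - a) / 2.0 * (f(a) + f(b))

# Integral of x^2 on [0, 1] is exactly 1/3; the trapezoid rule gives 1/2,
# since it replaces the parabola by the chord from (0, 0) to (1, 1).
approx = trapezoid(lambda x: x**2, 0.0, 1.0)
```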
(2) When m = 3, we see that the NC approximation to the integral can be stated as

    ∫_a^b P_2(x) dx = ((b - a)/6) [f(a) + 4 f(c) + f(b)]

where c = (a + b)/2 is the midpoint of the interval.
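The m = 3 formula is Simpson's rule. A small Python sketch (again illustrative, not the course's MATLAB; the cubic test integrand is a made-up example, chosen because Simpson's rule happens to integrate cubics exactly):

```python
def simpson(f, a, b):
    # m = 3 samples at a, c = (a+b)/2, b: integrate the degree-2
    # interpolant P2 on [a, b].
    c = (a + b) / 2.0
    return (b - a) / 6.0 * (f(a) + 4.0 * f(c) + f(b))

# Integral of x^3 on [0, 1] is exactly 1/4, and Simpson's rule
# reproduces it: (1/6) * (0 + 4*(1/8) + 1) = 1/4.
approx = simpson(lambda x: x**3, 0.0, 1.0)
```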
This type of pattern continues for larger values of m.