IE417: Nonlinear Programming: Lecture 4

Jeff Linderoth
Department of Industrial and Systems Engineering, Lehigh University
26th January 2006

Today's Outline
- Review
- Rates of Convergence
- Overview of Line Search Methods

Stuff We Learned Last Time
- The rate of change of a (continuously differentiable) function f : R^n -> R in the direction d in R^n at the point \hat{x} is d^T \nabla f(\hat{x}).
- The direction of steepest descent of f : R^n -> R at \hat{x} is -\nabla f(\hat{x}).

Newton's Method
- Assumes a quadratic model of the function at \hat{x}:

      m(d) = f(\hat{x}) + \nabla f(\hat{x})^T d + (1/2) d^T \nabla^2 f(\hat{x}) d

- The Newton step is the minimizer of this model:

      d_N = -[\nabla^2 f(\hat{x})]^{-1} \nabla f(\hat{x})

- If \nabla^2 f(\hat{x}) > 0 (positive definite), then d_N is a descent direction.
- Quasi-Newton methods: approximate \nabla^2 f(\hat{x}) by some matrix B that varies from iteration to iteration.

In Our Next Episode...
- Rates of Convergence of Line Search Methods
- Problem Time!

Try It. It's Free
- NEOS: Network-Enabled Optimization System, www-neos.mcs.anl.gov
- An easy interface that allows users to solve their numerical optimization problems with remote resources.
- Problems can be specified in 22 different formats.
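As an aside, the Newton step and the descent-direction check from the review can be sketched numerically. This is a minimal illustration, not part of the lecture; the toy function f(x, y) = x^4 + y^2 and the point (1, 1) are chosen only for the example.

```python
import numpy as np

# Toy function (an assumption for illustration): f(x, y) = x^4 + y^2

def grad(x):
    # Gradient of f: (4x^3, 2y)
    return np.array([4.0 * x[0]**3, 2.0 * x[1]])

def hess(x):
    # Hessian of f: diag(12x^2, 2); positive definite away from x = 0
    return np.array([[12.0 * x[0]**2, 0.0],
                     [0.0, 2.0]])

x_hat = np.array([1.0, 1.0])

# Newton step: d_N = -[Hessian]^{-1} gradient.
# Solving the linear system is preferred to forming the inverse explicitly.
d_N = -np.linalg.solve(hess(x_hat), grad(x_hat))
print(d_N)  # approximately [-1/3, -1]

# Descent check: d_N^T grad(x_hat) < 0 since the Hessian is positive definite
print(d_N @ grad(x_hat) < 0)  # True
```

Using `np.linalg.solve` rather than `np.linalg.inv` mirrors how Newton steps are computed in practice: one factorization of the Hessian per iteration.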
- AMPL and GAMS are the most popular.
- Started in 1994, with an email interface to the server based on netlib.
- September 1995: Version 1.
- February 2002: Version 4, with the Kestrel interface.

Homework!
- Turn in: 3.1
- Try: 3.2-3.8, 3.10
- Extra Credit: Use the NMTR method at http://neos.mcs.anl.gov/neos/solvers/uco:NMTR/C.html to minimize the Rosenbrock function and compare results with those from 3.1.
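For the extra-credit problem, the Rosenbrock function in its standard two-variable form (assumed here; the lecture does not spell it out) can be coded directly, along with its gradient for checking a candidate minimizer:

```python
import numpy as np

# Standard two-variable Rosenbrock function (an assumption; this is the
# usual textbook form): f(x, y) = 100*(y - x^2)^2 + (1 - x)^2,
# with unique minimizer (1, 1) and minimum value 0.

def rosenbrock(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    # Partial derivatives of f with respect to x and y
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

x_star = np.array([1.0, 1.0])
print(rosenbrock(x_star))       # 0.0
print(rosenbrock_grad(x_star))  # gradient vanishes at the minimizer
```

The narrow curved valley of this function is what makes it a standard stress test for line search and trust-region methods such as NMTR.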