Michael T. Heath, Scientific Computing (Unconstrained Optimization)

Example: Newton's Method

Do not explicitly invert Hessian matrix, but instead solve linear
system H_f(x_k) s_k = -∇f(x_k) for Newton step s_k

If Hessian matrix is not positive definite, Newton step may not be
descent direction; in this case, alternative descent direction can be
computed, such as negative gradient or direction of negative
curvature, and then line search performed along it
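The two ideas above can be sketched in Python: solve the linear system rather than invert the Hessian, and fall back to the negative gradient when the Hessian is not positive definite. This is a minimal illustration, not code from the source; the test function and all names are assumptions.

```python
import numpy as np

def newton_step(grad, hess, x):
    # Solve H s = -g for the Newton step rather than forming H^{-1}.
    g = grad(x)
    H = hess(x)
    try:
        # Cholesky factorization succeeds only when H is positive
        # definite, in which case the Newton step is a descent direction.
        np.linalg.cholesky(H)
        return np.linalg.solve(H, -g)
    except np.linalg.LinAlgError:
        # H not positive definite: fall back to an alternative descent
        # direction, here the negative gradient.
        return -g

# Illustrative convex test problem: f(x) = 0.5*x1^2 + 2.5*x2^2
grad = lambda x: np.array([x[0], 5.0 * x[1]])
hess = lambda x: np.array([[1.0, 0.0], [0.0, 5.0]])

x = np.array([5.0, 1.0])
for _ in range(5):
    x = x + newton_step(grad, hess, x)
```

For this quadratic the Hessian is constant and positive definite, so one exact Newton step reaches the minimizer; the Cholesky test matters only for nonconvex problems, and in practice the step would also be scaled by a line search.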
Trust Region Methods

Alternative to line search is trust region method, in which
approximate solution is constrained to lie within region where
quadratic model is sufficiently accurate
If current trust radius is binding, minimizing quadratic model
function subject to this constraint may modify direction as well as
length of Newton step

Accuracy of quadratic model is assessed by comparing actual decrease
in objective function with that predicted by quadratic model, and
trust radius is increased or decreased accordingly
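The radius-update logic just described can be sketched as follows; the ratio thresholds and the shrink/grow factors are illustrative choices, not values prescribed by the source.

```python
import numpy as np

def update_radius(f, x, s, g, H, radius, eta1=0.25, eta2=0.75):
    """Compare actual decrease in f with the decrease predicted by the
    quadratic model m(s) = f(x) + g @ s + 0.5 * s @ H @ s, then grow or
    shrink the trust radius accordingly."""
    actual = f(x) - f(x + s)
    predicted = -(g @ s + 0.5 * s @ (H @ s))
    rho = actual / predicted        # ratio near 1 means model is accurate
    if rho < eta1:
        radius *= 0.5               # model poor: shrink trust region
    elif rho > eta2 and np.isclose(np.linalg.norm(s), radius):
        radius *= 2.0               # model good and radius binding: expand
    return radius, rho > 0          # accept step only if f decreased
```

For a quadratic objective the model is exact, so rho = 1 and a step that hits the boundary always enlarges the radius.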
Quasi-Newton Methods

Many variants of Newton's method improve reliability and reduce
overhead

Quasi-Newton methods have form

    x_{k+1} = x_k - α_k B_k^{-1} ∇f(x_k)

where α_k is line search parameter and B_k is approximation to
Hessian matrix

Many quasi-Newton methods are more robust than Newton's method, are
superlinearly convergent, and have lower overhead per iteration, which
often more than offsets their slower convergence rate

Secant Updating Methods

Newton's method costs O(n^3) arithmetic and O(n^2) scalar function
evaluations per iteration for dense problem

Could use Broyden's method to seek zero of gradient, but this would
not preserve symmetry of Hessian matrix

Several secant updating formulas have been developed for minimization
that not only preserve symmetry in approximate Hessian matrix, but
also preserve positive definiteness

Symmetry reduces amount of work required by about half, while
positive definiteness guarantees that quasi-Newton step will be
descent direction
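As a concrete instance of such a secant update, the BFGS formula (one standard choice, sketched here; this code is not from the source) updates the approximate Hessian so that the secant condition holds while symmetry is preserved, and positive definiteness is preserved whenever y @ s > 0.

```python
import numpy as np

def bfgs_update(B, s, y):
    """BFGS update of the approximate Hessian B, where s = x_{k+1} - x_k
    and y = grad f(x_{k+1}) - grad f(x_k).  The result satisfies the
    secant condition B_new @ s = y, stays symmetric, and remains
    positive definite whenever y @ s > 0."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
```

With B_k maintained this way, each quasi-Newton step solves B_k s_k = -∇f(x_k); the symmetric positive definite structure can be exploited (e.g., via Cholesky factorization), which is the source of the factor-of-two savings noted above.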
This note was uploaded on 10/16/2011 for the course MECHANICAL 581 taught by Professor Wasfy during the Fall '11 term at IUPUI.