Heath, Scientific Computing — Optimization Problems (slide 40 of 74)

Unconstrained Optimization
(Michael T. Heath, Scientific Computing, slides 36-43 of 74)

Example: Newton's Method

Do not explicitly invert the Hessian matrix; instead, solve the linear system H_f(x_k) s_k = -∇f(x_k) for the Newton step s_k. When the Hessian is not positive definite, an alternative descent direction can be computed, such as the negative gradient or a direction of negative curvature, and a line search performed along it. (A minimal code sketch of the Newton step appears after this excerpt.)

Trust Region Methods

An alternative to a line search is a trust region method, in which the approximate solution is constrained to lie within a region where the quadratic model is sufficiently accurate.

Trust Region Methods, continued

If the current trust radius is binding, minimizing the quadratic model subject to this constraint may modify the direction as well as the length of the Newton step.

The accuracy of the quadratic model is assessed by comparing the actual decrease in the objective function with the decrease predicted by the model, and the trust radius is increased or decreased accordingly. (A sketch of this radius update appears below.)

Quasi-Newton Methods

Newton's method costs O(n^3) arithmetic operations and O(n^2) scalar function evaluations per iteration for a dense problem. Many variants of Newton's method improve reliability and reduce overhead.

Quasi-Newton methods have the form

    x_{k+1} = x_k − α_k B_k^{-1} ∇f(x_k)

where α_k is a line search parameter and B_k is an approximation to the Hessian matrix. Many quasi-Newton methods are more robust than Newton's method, are superlinearly convergent, and have lower overhead per iteration, which often more than offsets their slower convergence rate.

Secant Updating Methods

Broyden's method could be used to seek a zero of the gradient, but this would not preserve the symmetry of the Hessian matrix. Several secant updating formulas have been developed for minimization that preserve not only the symmetry of the approximate Hessian matrix but also its positive definiteness.

Symmetry reduces the amount of work required by about half, while positive definiteness guarantees that the quasi-Newton step will be a descent direction. (A BFGS-style sketch appears at the end of this excerpt.)
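The following is a minimal sketch of a single Newton step computed by solving the linear system H_f(x) s = -∇f(x) rather than forming the inverse Hessian. The callables grad and hess and the test quadratic are illustrative assumptions, not part of the original slides.

    # Sketch of one Newton step for unconstrained minimization.
    # grad(x) and hess(x) are assumed callables returning the gradient and Hessian.
    import numpy as np

    def newton_step(x, grad, hess):
        """Compute x_{k+1} by solving H(x) s = -grad(x); do not invert H explicitly."""
        g = grad(x)
        H = hess(x)
        s = np.linalg.solve(H, -g)   # solve the linear system for the Newton step
        return x + s

    # Illustrative quadratic f(x, y) = 0.5*x^2 + 2.5*y^2 starting from (5, 1)
    grad = lambda x: np.array([x[0], 5.0 * x[1]])
    hess = lambda x: np.array([[1.0, 0.0], [0.0, 5.0]])
    print(newton_step(np.array([5.0, 1.0]), grad, hess))   # one step reaches the minimizer of a quadratic

Because the objective is exactly quadratic, the quadratic model is exact and a single Newton step lands on the minimizer; for general nonlinear f the step is repeated until the gradient is sufficiently small.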
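The next sketch illustrates the trust-radius update described above: the actual decrease in f is compared with the decrease predicted by the quadratic model, and the radius is expanded or shrunk accordingly. The thresholds 0.25/0.75 and factors 0.5/2 are common illustrative choices, not values prescribed by the slides.

    # Hedged sketch of a trust-radius update based on the ratio of actual to
    # predicted reduction, assuming the quadratic model m(s) = f(x) + g^T s + 0.5 s^T H s.
    import numpy as np

    def update_trust_radius(f, x, s, g, H, radius):
        """Compare actual vs. predicted decrease and grow or shrink the trust radius."""
        actual = f(x) - f(x + s)                    # actual decrease in the objective
        predicted = -(g @ s + 0.5 * s @ (H @ s))    # decrease predicted by the quadratic model
        ratio = actual / predicted
        if ratio < 0.25:
            radius *= 0.5                           # model inaccurate: shrink the region
        elif ratio > 0.75 and np.isclose(np.linalg.norm(s), radius):
            radius *= 2.0                           # model accurate and constraint binding: expand
        accept = ratio > 0.0                        # accept the step only if f actually decreased
        return radius, accept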
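Finally, a hedged sketch of a quasi-Newton iteration using the BFGS secant update, one of the updates that preserves both symmetry and positive definiteness of the approximate Hessian B_k. The backtracking line search, tolerances, and test problem are assumptions added for illustration.

    # Sketch of quasi-Newton minimization with the BFGS update:
    #   x_{k+1} = x_k - alpha_k B_k^{-1} grad f(x_k)
    import numpy as np

    def bfgs_minimize(f, grad, x0, tol=1e-8, max_iter=100):
        x = np.asarray(x0, dtype=float)
        B = np.eye(len(x))                      # initial approximate Hessian
        g = grad(x)
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            d = np.linalg.solve(B, -g)          # quasi-Newton direction: solve B d = -g
            alpha = 1.0
            while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
                alpha *= 0.5                    # simple backtracking line search
            s = alpha * d
            x_new = x + s
            g_new = grad(x_new)
            y = g_new - g
            if y @ s > 1e-12:                   # curvature condition keeps B positive definite
                Bs = B @ s
                B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
            x, g = x_new, g_new
        return x

    # Usage on the same illustrative quadratic as above:
    f = lambda x: 0.5 * x[0]**2 + 2.5 * x[1]**2
    grad = lambda x: np.array([x[0], 5.0 * x[1]])
    print(bfgs_minimize(f, grad, [5.0, 1.0]))

Because B_k stays symmetric positive definite, each direction d is a descent direction, which is the property the secant-updating slide emphasizes.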