# Lecture 16: Advanced Mathematical Programming (IE417)


Dr. Ted Ralphs

## 1. Reading for This Lecture

Sections 8.6-8.8
## 2. Method of Steepest Descent

Up until now, we have discussed methods that use only function evaluations. If the objective function is differentiable, we can instead use the derivative to guide the search. Recall that the direction of steepest descent at a point x is -∇f(x).

Method of steepest descent: iteratively perform line searches in the direction of steepest descent. Because this is a line search algorithm, it will converge (to a stationary point) as long as f is continuously differentiable.
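The iteration above can be sketched as follows. This is a minimal illustration, not the lecture's code: the backtracking (Armijo) line search, the tolerance, and the test function are all illustrative choices.

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=10_000):
    """Minimize f by repeated line searches along -grad(f)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:        # stationary-point test
            break
        d = -g                             # direction of steepest descent
        # Backtracking line search: shrink the step t until the Armijo
        # sufficient-decrease condition f(x + t d) <= f(x) - 1e-4 t ||g||^2 holds.
        t = 1.0
        while f(x + t * d) > f(x) - 1e-4 * t * (g @ g):
            t *= 0.5
        x = x + t * d
    return x

# Example: minimize the convex quadratic f(x) = ||x - c||^2 / 2,
# whose unique minimizer is c (an illustrative test problem).
c = np.array([1.0, -2.0])
f = lambda x: 0.5 * (x - c) @ (x - c)
grad = lambda x: x - c
x_star = steepest_descent(f, grad, np.zeros(2))
```

Any line search satisfying a sufficient-decrease condition could be substituted for the backtracking loop; exact line search is another common choice in the text.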

## 3. Problems with This Algorithm

This algorithm can have problems if the Hessian is ill-conditioned. This is essentially because the linear approximation is not good when the gradient is near zero; in that case, the error term in the approximation begins to dominate. In the worst case, the search path can zigzag wildly.
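The slowdown is easy to see numerically. The sketch below (an illustration, not from the lecture) minimizes the quadratic f(x) = (x1² + αx2²)/2, whose Hessian diag(1, α) has condition number α, using exact line searches, and counts the iterations needed as α grows; the starting point (α, 1) is the classic choice that produces zigzagging.

```python
import numpy as np

def sd_iterations(alpha, tol=1e-8, max_iter=100_000):
    """Iterations of exact-line-search steepest descent on a quadratic
    with Hessian condition number alpha, from the worst-case start."""
    Q = np.diag([1.0, alpha])          # Hessian; condition number = alpha
    x = np.array([alpha, 1.0])         # starting point that induces zigzagging
    for k in range(max_iter):
        g = Q @ x                      # gradient of f(x) = x^T Q x / 2
        if np.linalg.norm(g) < tol:
            return k
        t = (g @ g) / (g @ Q @ g)      # exact line-search step for a quadratic
        x = x - t * g
    return max_iter

# Iteration counts grow sharply with the condition number.
counts = {a: sd_iterations(a) for a in (1.0, 10.0, 100.0)}
```

With α = 1 the first step lands exactly on the minimizer; for larger α the iterates bounce between the walls of the narrow valley, and the count grows roughly in proportion to α.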
## 4. Convergence Rate

Suppose the Hessian has condition number α. For a quadratic objective, steepest descent with exact line searches converges linearly, reducing the optimality gap f(x_k) - f(x*) by a factor of at most ((α - 1)/(α + 1))² per iteration. When α is large, this ratio is close to 1, so convergence can be very slow.
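The ((α - 1)/(α + 1))² factor can be checked empirically. In this sketch (an illustrative check, not the lecture's code), on f(x) = (x1² + αx2²)/2 with exact line searches, the worst-case starting point (α, 1) makes every iteration shrink f by exactly that ratio.

```python
import numpy as np

alpha = 10.0
Q = np.diag([1.0, alpha])              # Hessian with condition number alpha
f = lambda x: 0.5 * x @ Q @ x

x = np.array([alpha, 1.0])             # worst-case starting point
ratios = []
for _ in range(5):
    g = Q @ x                          # gradient of f
    t = (g @ g) / (g @ Q @ g)          # exact line-search step for a quadratic
    x_next = x - t * g
    ratios.append(f(x_next) / f(x))    # observed per-iteration reduction
    x = x_next

theory = ((alpha - 1.0) / (alpha + 1.0)) ** 2
```

For α = 10 the ratio is (9/11)² ≈ 0.669; for α = 100 it is (99/101)² ≈ 0.961, which is why ill-conditioned problems make such slow progress.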

