Newton's method won't always converge:
No convergence if for some iterate xn the Jacobian ∇g(xn) is singular: computing the Newton step requires inverting the Jacobian, so a singular ∇g(xn) is the multivariate analogue of dividing by zero.
The Modified Newton's Method from the lecture notes can help here.

Sometimes there is no convergence from our starting point x0 if x0 is too far from the root.
In practice, it's a good idea to set a maximum number of iterations so your algorithm doesn't run forever; both safeguards appear in the sketch below.
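To make these safeguards concrete, here is a minimal sketch of a Newton iteration for solving g(x) = 0, with a singularity check and an iteration cap. It is not the Modified Newton's Method from the lecture notes, and the function g, its Jacobian jac, and the tolerances are illustrative placeholders.

import numpy as np

def newton(g, jac, x0, tol=1e-10, max_iter=50):
    """Find a root of g by Newton's method, with basic safeguards."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        J = np.atleast_2d(jac(x))
        # Stop if the Jacobian is (numerically) singular: the Newton
        # step solves J * step = g(x), which is then ill-defined.
        if abs(np.linalg.det(J)) < 1e-14:
            raise RuntimeError("singular Jacobian at iterate x = %s" % x)
        step = np.linalg.solve(J, np.atleast_1d(g(x)))
        x = x - step
        if np.linalg.norm(step) < tol:
            return x
    # Cap the number of iterations so a divergent run stops.
    raise RuntimeError("no convergence within %d iterations" % max_iter)

# Example: g(x) = x^2 - 2, root at sqrt(2), starting from x0 = 1.
root = newton(lambda x: x**2 - 2, lambda x: 2 * x, np.array([1.0]))
print(root)  # approximately [1.41421356]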
HW5 Q1 - Gradient Descent and Newton's Method

(a) Use the theorem on steepest gradient descent convergence, and remember to check for convexity to decide whether the optimum you find is global or only local (one way to check convexity is sketched below).
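For the convexity check, one standard approach for a twice-differentiable objective is to test whether the Hessian is positive semidefinite. The matrix Q below is a hypothetical stand-in, not the actual HW5 objective:

import numpy as np

# Hypothetical Hessian: for f(x) = 0.5 * x^T Q x - b^T x the Hessian
# is the constant matrix Q. Substitute the Hessian of your objective.
Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])

# A twice-differentiable f is convex iff its Hessian is positive
# semidefinite everywhere; for a symmetric matrix, check eigenvalues.
eigenvalues = np.linalg.eigvalsh(Q)
if np.all(eigenvalues > 0):
    print("positive definite Hessian: f is strictly convex,")
    print("so a stationary point is the unique global minimum")
elif np.all(eigenvalues >= 0):
    print("positive semidefinite Hessian: f is convex (global minima)")
else:
    print("indefinite Hessian: stationary points may be only local")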
(b) Perform just the first step of the steepest gradient descent method and of Newton's method. To determine which will converge faster, recall what we know about the convergence of Newton's method for quadratic problems: on a quadratic with a nonsingular Hessian, a single Newton step lands exactly on the stationary point (illustrated below).
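As a quick illustration, the sketch below takes one step of each method on a made-up quadratic f(x) = 0.5 x^T Q x - b^T x (not the HW function). The Newton step reaches the minimizer Q^{-1} b exactly, while one gradient step generally does not:

import numpy as np

# Made-up quadratic f(x) = 0.5 x^T Q x - b^T x, with gradient Qx - b
# and constant Hessian Q; its unique minimizer is x* = Q^{-1} b.
Q = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])
x0 = np.zeros(2)

grad = Q @ x0 - b

# One steepest-descent step with exact line search: for a quadratic,
# the optimal step size is alpha = (g^T g) / (g^T Q g).
alpha = (grad @ grad) / (grad @ Q @ grad)
x_gd = x0 - alpha * grad

# One Newton step: x - H^{-1} grad, with Hessian H = Q.
x_newton = x0 - np.linalg.solve(Q, grad)

x_star = np.linalg.solve(Q, b)
print("after one GD step:    ", x_gd)
print("after one Newton step:", x_newton)  # equals x_star exactly
print("true minimizer:       ", x_star)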
(c) To determine when Newton's method will converge, use the fact that we already know where g(x) = x^α equals zero: at x = 0. Newton's method con...
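For reference, here is a sketch of the computation this hint points toward, assuming g(x) = x^α with a fixed exponent α > 0 (check the range of α your assignment actually allows):

\[
x_{n+1} = x_n - \frac{g(x_n)}{g'(x_n)}
        = x_n - \frac{x_n^{\alpha}}{\alpha\, x_n^{\alpha-1}}
        = \left(1 - \frac{1}{\alpha}\right) x_n
        = \frac{\alpha - 1}{\alpha}\, x_n,
\]

so each Newton step multiplies xn by the fixed constant (α - 1)/α. The iterates converge to the root x = 0 exactly when |(α - 1)/α| < 1, which for α > 0 reduces to α > 1/2.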