
# Newton's Method for Unconstrained Minimization


Given an initial guess $x^0$, Newton's Method proceeds as follows:

1. Find $\nabla f(x)$, $\nabla^2 f(x)$, and $\nabla^2 f(x)^{-1}$.
2. Set $x^{k+1} = x^k - \nabla^2 f(x^k)^{-1}\,\nabla f(x^k)$.
3. If $\|x^{k+1} - x^k\|_2 < \epsilon$, return $x^{k+1}$. Otherwise, repeat step 2.

## Unconstrained Newton's Method Example

Minimize $f(x) = x_1^2 + 2x_2^2 + x_1 x_2 + 3x_2$ with $x^0 = (0, 0)$ and $\epsilon = 0.001$.

**Step 1:**

$$
\nabla f(x) = \begin{pmatrix} 2x_1 + x_2 \\ x_1 + 4x_2 + 3 \end{pmatrix}, \qquad
\nabla^2 f(x) = \begin{pmatrix} 2 & 1 \\ 1 & 4 \end{pmatrix}, \qquad
\nabla^2 f(x)^{-1} = \frac{1}{7}\begin{pmatrix} 4 & -1 \\ -1 & 2 \end{pmatrix}
$$

**First iteration:**

$$
x^1 = x^0 - \nabla^2 f(x^0)^{-1}\,\nabla f(x^0)
= \begin{pmatrix} 0 \\ 0 \end{pmatrix} - \frac{1}{7}\begin{pmatrix} 4 & -1 \\ -1 & 2 \end{pmatrix}\begin{pmatrix} 0 \\ 3 \end{pmatrix}
= \begin{pmatrix} 3/7 \\ -6/7 \end{pmatrix}
$$

**Check for convergence:**

$$
\nabla f(x^1) = \begin{pmatrix} 2\cdot\frac{3}{7} - \frac{6}{7} \\[4pt] \frac{3}{7} + 4\cdot\left(-\frac{6}{7}\right) + 3 \end{pmatrix}
= \begin{pmatrix} 0 \\ 0 \end{pmatrix}
$$

The gradient is zero, so terminate the algorithm: it converged in only one step.

## Newton's Method Convergence

When Newton's Method converges, it does so quickly (quadratically or better).

- For a quadratic minimization problem, Newton's Method converges in just one step.
- But Newton's Met...
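The iteration above can be reproduced in a short, runnable sketch. The helper names (`grad`, `newton_step`, `H_INV`) are my own, not from the handout; exact rational arithmetic via the standard-library `fractions.Fraction` makes the one-step convergence visible exactly, with no floating-point round-off.

```python
# Newton's Method on the slides' example f(x) = x1^2 + 2 x2^2 + x1 x2 + 3 x2,
# using exact rational arithmetic (a sketch; helper names are illustrative).
from fractions import Fraction as F

def grad(x1, x2):
    # Gradient of f: (2 x1 + x2, x1 + 4 x2 + 3)
    return (2 * x1 + x2, x1 + 4 * x2 + 3)

# The Hessian [[2, 1], [1, 4]] is constant for a quadratic, so its inverse
# (1/7) [[4, -1], [-1, 2]] can be computed once up front.
H_INV = ((F(4, 7), F(-1, 7)), (F(-1, 7), F(2, 7)))

def newton_step(x1, x2):
    # x_{k+1} = x_k - H^{-1} grad f(x_k)
    g1, g2 = grad(x1, x2)
    (a, b), (c, d) = H_INV
    return (x1 - (a * g1 + b * g2), x2 - (c * g1 + d * g2))

x1_vec = newton_step(F(0), F(0))  # one step from x0 = (0, 0)
print(x1_vec)                     # (Fraction(3, 7), Fraction(-6, 7))
print(grad(*x1_vec))              # gradient is exactly (0, 0): converged
```

Because the objective is quadratic, the second-order Taylor model that Newton's Method minimizes at each step is the objective itself, which is why a single step lands on the exact minimizer.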

## This note was uploaded on 12/10/2013 for the course MS&E 211, taught by Professor Yinyu Ye during the Fall '07 term at Stanford.
