# Lecture11-2003: Lecture XI, Quasi-Newton Methods of Optimization


I. A Baseline Scenario

A. In this lecture, we develop several alternatives to the Newton-Raphson algorithm. As a starting point, I want to discuss a prototype algorithm.

Algorithm U (model algorithm for n-dimensional unconstrained minimization). Let $x_k$ be the current estimate of $x^*$, the point which minimizes the objective function $f(x)$.

U1. [Test for convergence] If the conditions for convergence are satisfied, the algorithm terminates with $x_k$ as the solution.
U2. [Compute a search direction] Compute a nonzero $n$-vector $p_k$, the direction of the search.
U3. [Compute a step length] Compute a scalar $a_k$, the step length, for which $f(x_k + a_k p_k) < f(x_k)$.
U4. [Update the estimate of the minimum] Set $x_{k+1} = x_k + a_k p_k$, set $k = k+1$, and go back to step U1.

B. Given the steps of the prototype algorithm, I want to develop a sample problem against which we can compare the various algorithms. Notebook Algorithm3.ma includes the numeric problem:

$$\max_x \; x_1^{.2}\, x_2^{.3}\, x_3^{.4}\, x_4^{.1} \quad \text{s.t.} \quad x_1 + 2x_2 + 3x_3 + x_4 = 100$$

Using Newton-Raphson, the optimal point for this problem is found in 10 iterations using 1.23 seconds on the DEC Alpha.

II. An Overview of Newton and Quasi-Newton Algorithms

A. The Newton-Raphson methodology can be used in step U2 of the prototype algorithm. Specifically, the search direction can be determined by

$$p_k = -\left[\nabla^2_{xx} f(x_k)\right]^{-1} \nabla_x f(x_k)$$

B. Quasi-Newton algorithms involve an approximation to the Hessian matrix. For example, we could replace the Hessian matrix with the negative of the identity matrix for the maximization problem. In this case the search direction would be

$$p_k = -\left[-I(n)\right]^{-1} \nabla_x f(x_k)$$

where $I(n)$ is the identity matrix conformable with the gradient vector. This replacement is referred to as the steepest descent method. On our sample problem, this methodology requires 990 iterations and 29.28 seconds on the DEC Alpha.
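Steps U1 through U4 of Algorithm U can be sketched in code. The following is a minimal sketch, assuming NumPy; the names `model_algorithm_u` and `backtracking` are my own, not from the lecture, and the quadratic in the usage example is a hypothetical test function, not the lecture's sample problem:

```python
import numpy as np

def backtracking(f, x, p, g, a=1.0, beta=0.5, c=1e-4):
    """Step U3: shrink the step length a until f(x + a*p) decreases sufficiently."""
    while f(x + a * p) > f(x) + c * a * (g @ p):
        a *= beta
    return a

def model_algorithm_u(f, grad, search_dir, x0, tol=1e-8, max_iter=1000):
    """Prototype Algorithm U for n-dimensional unconstrained minimization."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:     # U1: test for convergence
            return x, k
        p = search_dir(x, g)            # U2: compute a search direction
        a = backtracking(f, x, p, g)    # U3: compute a step length
        x = x + a * p                   # U4: update the estimate, k = k + 1
    return x, max_iter

# Usage: minimize f(x) = x1^2 + 5*x2^2 with the Newton direction p = -H^{-1} g
H = np.diag([2.0, 10.0])
f = lambda x: x[0] ** 2 + 5 * x[1] ** 2
grad = lambda x: H @ x
newton_dir = lambda x, g: -np.linalg.solve(H, g)
x_star, iters = model_algorithm_u(f, grad, newton_dir, [3.0, -2.0])
```

Because the search direction is supplied as a function, the same loop serves for Newton-Raphson, steepest descent, or any quasi-Newton variant; only step U2 changes.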
This result highlights some interesting features of the quasi-Newton methods:

1. The steepest descent method requires more overall iterations. In this example, the steepest descent method requires 99 times as many iterations as the Newton-Raphson method.
