# Lecture-4: Line Search Methods: Search Directions and Step Lengths


## 1. Line Search Methods

A line search method generates iterates

$$x_{k+1} = x_k + \alpha_k p_k,$$

with search direction

$$p_k = -B_k^{-1} \nabla f_k,$$

where $\nabla f_k = \nabla f(x_k)$. The choice of $B_k$ distinguishes the methods:

- Steepest descent: $B_k$ is the identity matrix.
- Newton's method: $B_k$ is the Hessian matrix.
- Quasi-Newton: $B_k$ is an approximation to the Hessian matrix.
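The three direction choices can be sketched on a small quadratic objective. This is a minimal illustration, assuming NumPy; the matrix `A`, vector `b`, starting point, and the diagonal quasi-Newton approximation are invented example values, not from the lecture:

```python
import numpy as np

# Illustrative quadratic f(x) = 0.5 x^T A x - b^T x; A and b are
# assumed example values chosen only to show the three directions.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])      # constant Hessian of this quadratic
b = np.array([1.0, 1.0])

def grad(x):
    # Gradient of the quadratic: A x - b
    return A @ x - b

x = np.array([2.0, -1.0])       # current iterate x_k
g = grad(x)

# Steepest descent: B_k = I, so p_k = -grad f_k
p_sd = -g

# Newton: B_k is the Hessian, so p_k solves A p = -grad f_k
p_newton = -np.linalg.solve(A, g)

# Quasi-Newton: B_k approximates the Hessian (a crude diagonal
# approximation here, purely for illustration)
B = np.diag(np.diag(A))
p_qn = -np.linalg.solve(B, g)
```

When $B_k$ is positive definite, each of these is a descent direction ($\nabla f_k^T p_k < 0$); on a quadratic, a full Newton step lands exactly at the minimizer.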

## 2. Inverse-Hessian Quasi-Newton and Conjugate Gradient

Instead of inverting an approximation of the Hessian, we can directly compute an approximation of the inverse of the Hessian, $H_k = B_k^{-1}$, giving the search direction

$$p_k = -H_k \nabla f_k.$$

With

$$s_k = x_{k+1} - x_k, \qquad y_k = \nabla f_{k+1} - \nabla f_k,$$

the inverse-Hessian approximation is updated by

$$H_{k+1} = \left(I - \rho_k s_k y_k^T\right) H_k \left(I - \rho_k y_k s_k^T\right) + \rho_k s_k s_k^T, \qquad \rho_k = \frac{1}{y_k^T s_k}.$$

**Conjugate Gradient.** The search direction is

$$p_k = -\nabla f(x_k) + \beta_k p_{k-1},$$

where $\beta_k$ is a scalar chosen so that $p_k$ and $p_{k-1}$ are conjugate. Two vectors are conjugate with respect to a matrix $G$ if

$$p_k^T G p_{k-1} = 0.$$

Conjugate directions are non-interfering: they have the special property that minimization along one direction is not spoiled by subsequent minimization along another.
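A minimal sketch of both formulas, assuming NumPy. The inverse-Hessian update here is the BFGS formula; the demo vectors `s`, `y` and the matrix `G` are invented illustrative values:

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    # H_{k+1} = (I - rho s y^T) H_k (I - rho y s^T) + rho s s^T,
    # with rho = 1 / (y^T s); requires y^T s > 0 (curvature condition).
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def cg_direction(g_new, p_prev, G):
    # p_k = -grad f + beta * p_{k-1}, with the scalar beta chosen
    # so that p_k^T G p_{k-1} = 0 (conjugacy with respect to G).
    beta = (g_new @ G @ p_prev) / (p_prev @ G @ p_prev)
    return -g_new + beta * p_prev

# Demo with assumed small values.
s = np.array([1.0, 0.0])          # s_k = x_{k+1} - x_k
y = np.array([2.0, 1.0])          # y_k = grad_{k+1} - grad_k (y^T s > 0)
H1 = bfgs_inverse_update(np.eye(2), s, y)

G = np.array([[4.0, 1.0],
              [1.0, 3.0]])        # symmetric positive definite
p_prev = np.array([1.0, -1.0])
p_new = cg_direction(np.array([0.5, 2.0]), p_prev, G)
```

A quick sanity check on the update is the secant condition $H_{k+1} y_k = s_k$, which this formula satisfies by construction.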
## 3. Step Length

The ideal step length $\alpha$ is the global minimizer of the univariate function

$$\phi(\alpha) = f(x_k + \alpha p_k), \qquad \alpha > 0.$$

- Exact search: find this global minimizer. This requires too many evaluations of the function and its gradient.
- Inexact search: achieve an adequate reduction in $f$ at minimal cost, organized as a two-step method.
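One standard way to realize "adequate reduction at minimal cost" is backtracking with the Armijo sufficient-decrease condition; the slides do not name this scheme, so the following is an assumed illustrative sketch, and the quadratic test problem and the constants `c` and `shrink` are invented values:

```python
import numpy as np

def backtracking(f, grad_f, x, p, alpha0=1.0, c=1e-4, shrink=0.5):
    # Start from a full step and shrink alpha until the Armijo
    # sufficient-decrease condition holds:
    #   f(x + alpha p) <= f(x) + c * alpha * grad f(x)^T p
    fx = f(x)
    slope = grad_f(x) @ p        # negative when p is a descent direction
    alpha = alpha0
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= shrink
    return alpha

# Assumed quadratic test problem (illustrative values only).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad_f = lambda x: A @ x - b

x = np.array([2.0, -1.0])
p = -grad_f(x)                   # steepest-descent direction
alpha = backtracking(f, grad_f, x, p)
```

Because the loop only ever evaluates $f$ (one call per trial step) and the gradient once, it matches the goal of an inexact search: a guaranteed decrease in $f$ at low cost.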


## This note was uploaded on 06/12/2011 for the course COT 6505 taught by Professor Shah during the Spring '07 term at University of Central Florida.
