MIT OpenCourseWare
http://ocw.mit.edu

16.323 Principles of Optimal Control
Spring 2008

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms. (Notes dated June 18, 2008.)

16.323 Lecture 1
Nonlinear Optimization

- Unconstrained nonlinear optimization
- Line search methods

Basics — Unconstrained

The typical objective is to minimize a nonlinear function $F(x)$ of the parameters $x$. Assume that $F(x)$ is scalar:

$$x^\star = \arg\min_x F(x)$$

Define two types of minima:

- Strong: the objective function increases locally in all directions. A point $x^\star$ is a strong minimum of a function $F(x)$ if a scalar $\delta > 0$ exists such that $F(x^\star) < F(x^\star + \Delta x)$ for all $\Delta x$ with $0 < \|\Delta x\| \le \delta$.

- Weak: the objective function remains the same in some directions and increases locally in other directions. A point $x^\star$ is a weak minimum of a function $F(x)$ if it is not a strong minimum and a scalar $\delta > 0$ exists such that $F(x^\star) \le F(x^\star + \Delta x)$ for all $\Delta x$ with $0 < \|\Delta x\| \le \delta$.

Note that a minimum is a unique global minimum if the definitions hold for $\delta = \infty$. Otherwise these are local minima.

[Figure 1.1: $F(x) = x^4 - 2x^2 + x + 3$, showing its local and global minima.]

First Order Conditions

If $F(x)$ has continuous second derivatives, the function can be approximated in the neighborhood of an arbitrary point using a Taylor series:

$$F(x + \Delta x) \approx F(x) + \Delta x^T g(x) + \frac{1}{2}\,\Delta x^T G(x)\,\Delta x + \cdots$$

where $g$ is the gradient of $F$ and $G$ is the second derivative (Hessian) of $F$:

$$x = \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix}, \qquad
g = \frac{\partial F}{\partial x} = \begin{bmatrix} \dfrac{\partial F}{\partial x_1} \\ \vdots \\ \dfrac{\partial F}{\partial x_n} \end{bmatrix}, \qquad
G = \begin{bmatrix}
\dfrac{\partial^2 F}{\partial x_1^2} & \cdots & \dfrac{\partial^2 F}{\partial x_1\,\partial x_n} \\
\vdots & \ddots & \vdots \\
\dfrac{\partial^2 F}{\partial x_n\,\partial x_1} & \cdots & \dfrac{\partial^2 F}{\partial x_n^2}
\end{bmatrix}$$

The first-order condition follows from the first two terms (assume $\|\Delta x\| \ll 1$). Given the ambiguity of the sign of the term $\Delta x^T g(x)$, a cost decrease $F(x + \Delta x) < F(x)$ can only be avoided if $g(x^\star) = 0$. Further information must be obtained from the higher derivatives.

$g(x^\star) = 0$ is a necessary and sufficient condition for a point to be a stationary point, but only a necessary, not sufficient, condition for it to be a minimum.
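As a minimal sketch (not part of the original notes), the first-order condition $g(x) = 0$ can be solved numerically for the scalar example $F(x) = x^4 - 2x^2 + x + 3$ from Figure 1.1. The Newton iteration on the gradient and the starting points below are illustrative choices:

```python
# Sketch: locate the stationary points of F(x) = x^4 - 2x^2 + x + 3
# by applying Newton's method to the first-order condition g(x) = 0.

def F(x):
    return x**4 - 2*x**2 + x + 3

def g(x):
    # Gradient F'(x)
    return 4*x**3 - 4*x + 1

def G(x):
    # Second derivative F''(x)
    return 12*x**2 - 4

def newton_root(x, tol=1e-12, max_iter=100):
    """Solve g(x) = 0 from starting point x via Newton's method."""
    for _ in range(max_iter):
        step = g(x) / G(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Start near each of the three stationary points visible in Figure 1.1.
points = [newton_root(x0) for x0 in (-1.5, 0.3, 1.0)]
for x in points:
    kind = "minimum" if G(x) > 0 else "maximum or saddle"
    print(f"x* = {x:+.6f}   g(x*) = {g(x):+.2e}   F''(x*) = {G(x):+.3f}   ({kind})")
```

Each returned point satisfies $g(x^\star) \approx 0$, yet only two of the three are minima; the sign of $F''$ at each root is what distinguishes them, which previews the second-order conditions below.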
A stationary point could also be a maximum or a saddle point.

Second Order Conditions

Additional conditions can be derived from the Taylor expansion if we set $g(x^\star) = 0$, in which case:

$$F(x^\star + \Delta x) \approx F(x^\star) + \frac{1}{2}\,\Delta x^T G(x^\star)\,\Delta x + \cdots$$

For a strong minimum, we need $\Delta x^T G(x^\star)\,\Delta x > 0$ for all $\Delta x$, which is sufficient to ensure that $F(x^\star + \Delta x) > F(x^\star)$. For this to hold for arbitrary $\Delta x \ne 0$, a sufficient condition is that $G(x^\star) > 0$ (positive definite).
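The positive-definiteness test on $G(x^\star)$ can be carried out without computing eigenvalues. A common route (an illustrative sketch, not from the notes) is attempted Cholesky factorization: a symmetric matrix is positive definite exactly when the factorization succeeds with strictly positive pivots. The Hessians below are simple hand-picked examples:

```python
# Sketch: test the sufficient second-order condition G(x*) > 0 (PD)
# via Cholesky factorization of a symmetric matrix given as lists.
import math

def is_positive_definite(G):
    """Return True iff symmetric matrix G is positive definite."""
    n = len(G)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                d = G[i][i] - s
                if d <= 0.0:
                    # Non-positive pivot: G is not positive definite.
                    return False
                L[i][j] = math.sqrt(d)
            else:
                L[i][j] = (G[i][j] - s) / L[j][j]
    return True

# Hessian of F(x, y) = x^2 + y^2: PD everywhere => strong minimum.
print(is_positive_definite([[2.0, 0.0], [0.0, 2.0]]))   # True
# Hessian of F(x, y) = x^2 - y^2: indefinite => saddle point.
print(is_positive_definite([[2.0, 0.0], [0.0, -2.0]]))  # False
```

A failed factorization thus flags a stationary point that is not a strong minimum, which is consistent with the saddle/maximum cases noted above.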