MIT OpenCourseWare
http://ocw.mit.edu

16.323 Principles of Optimal Control
Spring 2008

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.

16.323 Lecture 1
Nonlinear Optimization

• Unconstrained nonlinear optimization
• Line search methods

Figure by MIT OpenCourseWare.

Basics – Unconstrained

• Typical objective is to minimize a nonlinear function F(x) of the parameters x.
  – Assume that F(x) is scalar ⇒ x* = arg min_x F(x)

• Define two types of minima:
  – Strong: objective function increases locally in all directions.
    A point x* is a strong minimum of a function F(x) if a scalar δ > 0 exists such that F(x*) < F(x* + Δx) for all Δx such that 0 < ‖Δx‖ ≤ δ.
  – Weak: objective function remains the same in some directions, and increases locally in other directions.
    A point x* is a weak minimum of a function F(x) if it is not a strong minimum and a scalar δ > 0 exists such that F(x*) ≤ F(x* + Δx) for all Δx such that 0 < ‖Δx‖ ≤ δ.

• Note that a minimum is the unique global minimum if the definitions hold for δ = ∞. Otherwise these are local minima.

Figure 1.1: F(x) = x⁴ − 2x² + x + 3 with local and global minima.

First Order Conditions

• If F(x) has continuous second derivatives, can approximate the function in the neighborhood of an arbitrary point using a Taylor series:

    F(x + Δx) ≈ F(x) + Δxᵀ g(x) + ½ Δxᵀ G(x) Δx + …

  where g ∼ gradient of F and G ∼ second derivative (Hessian) of F:

    x = [x₁, …, xₙ]ᵀ
    g(x) = ∂F/∂x = [∂F/∂x₁, …, ∂F/∂xₙ]ᵀ
    G(x) = ∂²F/∂x∂xᵀ, with elements Gᵢⱼ = ∂²F/∂xᵢ∂xⱼ (so G₁₁ = ∂²F/∂x₁², G₁ₙ = ∂²F/∂x₁∂xₙ, etc.)

• First-order condition follows from the first two terms (assume ‖Δx‖ ≪ 1):
  – Given the ambiguity of the sign of the term Δxᵀ g(x), can only avoid a cost decrease F(x + Δx) < F(x) if g(x*) = 0 ⇒ obtain further information from the higher derivatives.
  – g(x*) = 0 is a necessary and sufficient condition for a point to be a stationary point; it is a necessary, but not sufficient, condition for a minimum.
  – A stationary point could also be a maximum or a saddle point.

• Additional conditions can be derived from the Taylor expansion if we set g(x*) = 0, in which case:

    F(x* + Δx) ≈ F(x*) + ½ Δxᵀ G(x*) Δx + …

  – For a strong minimum, need Δxᵀ G(x*) Δx > 0 for all Δx, which is sufficient to ensure that F(x* + Δx) > F(x*).
  – For this to be true for arbitrary Δx ≠ 0, a sufficient condition is that G(x*) > 0 (positive definite, PD).

June 18, 2008
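As a concrete check of these conditions, the following is a minimal numerical sketch (using NumPy; not part of the original notes) that finds the stationary points of the Figure 1.1 function from g(x*) = 0 and classifies each one by the sign of the second derivative G(x*):

```python
import numpy as np

# F(x) = x^4 - 2x^2 + x + 3, the scalar function plotted in Figure 1.1
F = lambda x: x**4 - 2*x**2 + x + 3
g = lambda x: 4*x**3 - 4*x + 1      # gradient (first derivative)
G = lambda x: 12*x**2 - 4           # Hessian (second derivative)

# First-order condition: stationary points satisfy g(x*) = 0.
# For this cubic gradient, the roots can be found directly:
roots = np.roots([4, 0, -4, 1])
stationary = np.sort(roots[np.isreal(roots)].real)

# Second-order condition: classify each stationary point by the sign of G(x*).
for x in stationary:
    kind = "minimum" if G(x) > 0 else ("maximum" if G(x) < 0 else "inconclusive")
    print(f"x* = {x:+.4f}  F(x*) = {F(x):.4f}  G(x*) = {G(x):+.3f}  -> local {kind}")

# Of the points with G(x*) > 0, the one with the smallest F is the global
# minimum on this plot (near x = -1.11, matching the figure):
x_global = min([x for x in stationary if G(x) > 0], key=F)
```

This recovers one local maximum and two local minima; only comparing F at the minima identifies the global one, illustrating why the derivative conditions alone are local statements.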
Spring 2008 – Prof. Jonathan How
Keywords: optimization, line search, Rosenbrock, BFGS
