Issues in Non-Convex Optimization

Robert M. Freund, with assistance from Brian W. Anthony

April 22, 2004

© 2004 Massachusetts Institute of Technology
1 Outline

- General Nonlinear Optimization Problem
- Optimality Conditions for NLP
- Sequential Quadratic Programming (SQP) Method
- LOQO: Combining Interior-Point Methods and SQP
- Practical Issues in Solving NLP Problems

2 General Nonlinear Optimization Problem

NLP:  minimize_x  f(x)
      s.t.  g_i(x) = 0,  i ∈ E
            g_i(x) ≤ 0,  i ∈ I
            x ∈ ℝ^n,

where f(x): ℝ^n → ℝ and g_i(x): ℝ^n → ℝ, i ∈ E ∪ I; E denotes the indices of the equality constraints, and I denotes the indices of the inequality constraints.

2.1 General Comments

Non-convex optimization problems arise in just about every economic and scientific domain:

- radiation therapy
- engineering product design
- economics: Nash equilibria
- finance: options pricing
- industrial engineering: traffic equilibria, supply chain management
- many other domains as well

Non-convex optimization is hard. Since x_j − x_j² = 0 if and only if x_j ∈ {0, 1}, we can formulate binary integer optimization as the following nonlinear optimization instance:

BIP:  minimize_x  cᵀx
      s.t.  Ax ≤ b
            x_j − x_j² = 0,  j = 1, …, n
            x ∈ ℝ^n

2.2 Useful Definitions

The feasible region F of NLP is the set

F = { x | g_i(x) = 0 for i ∈ E, g_i(x) ≤ 0 for i ∈ I }.

We have the following definitions of local/global, strict/non-strict minima/maxima.

Definition 2.1 x̄ ∈ F is a local minimum of NLP if there exists ε > 0 such that f(x̄) ≤ f(x) for all x ∈ B(x̄, ε) ∩ F.

Definition 2.2 x̄ ∈ F is a global minimum of NLP if f(x̄) ≤ f(x) for all x ∈ F.

Definition 2.3 x̄ ∈ F is a strict local minimum of NLP if there exists ε > 0 such that f(x̄) < f(x) for all x ∈ B(x̄, ε) ∩ F, x ≠ x̄.

Definition 2.4 x̄ ∈ F is a strict global minimum of NLP if f(x̄) < f(x) for all x ∈ F, x ≠ x̄.
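The distinction between local and global minima in Definitions 2.1 and 2.2 is exactly what makes non-convex optimization hard: a local method can stop at a point that is locally optimal but far from globally optimal. The sketch below is my own illustration (not from the notes), using a one-dimensional non-convex function with two local minima of different depths:

```python
# Sketch (hypothetical example): f(x) = x^4 - 2x^2 + 0.3x is non-convex
# with two local minima; only the left one is the global minimum.
from scipy.optimize import minimize_scalar

f = lambda x: x**4 - 2*x**2 + 0.3*x

# Search each basin separately with a bounded 1-D minimizer.
left  = minimize_scalar(f, bounds=(-2.0, 0.0), method="bounded")
right = minimize_scalar(f, bounds=(0.0, 2.0), method="bounded")

# Both points satisfy Definition 2.1 (local minimum), but only the
# left point satisfies Definition 2.2 (global minimum).
print(left.x, f(left.x))
print(right.x, f(right.x))
```

A local NLP solver started in the right basin would happily return the shallower minimum; nothing in the local optimality conditions distinguishes it from the global one.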
Definition 2.5 x̄ ∈ F is a local maximum of NLP if there exists ε > 0 such that f(x̄) ≥ f(x) for all x ∈ B(x̄, ε) ∩ F.

Definition 2.6 x̄ ∈ F is a global maximum of NLP if f(x̄) ≥ f(x) for all x ∈ F.

Definition 2.7 x̄ ∈ F is a strict local maximum of NLP if there exists ε > 0 such that f(x̄) > f(x) for all x ∈ B(x̄, ε) ∩ F, x ≠ x̄.

Definition 2.8 x̄ ∈ F is a strict global maximum of NLP if f(x̄) > f(x) for all x ∈ F, x ≠ x̄.

If x̄ is feasible for NLP, we let I(x̄) denote the indices of the active inequality constraints, namely:

I(x̄) := { i ∈ I | g_i(x̄) = 0 }.

3 Optimality Conditions for NLP

Theorem: Karush-Kuhn-Tucker Necessary Conditions. Suppose that f(x) and g_i(x), i ∈ E ∪ I, are all differentiable functions. Under mild additional conditions, if x̄ is a local minimum of NLP, then there exists ȳ for which

(i)   ∇f(x̄) + ∑_{i ∈ E} ȳ_i ∇g_i(x̄) + ∑_{i ∈ I} ȳ_i ∇g_i(x̄) = 0
(ii)  g_i(x̄) = 0,  i ∈ E
(iii) g_i(x̄) ≤ 0,  i ∈ I
(iv)  ȳ_i ≥ 0,  i ∈ I
(v)   ȳ_i · g_i(x̄) = 0,  i ∈ I

q.e.d.
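As a numerical illustration of the KKT conditions (my own toy problem, not from the notes), the sketch below checks conditions (i) and (ii) at a candidate point for minimize x₁² + x₂² s.t. x₁ + x₂ − 1 = 0, where E = {1} and I = ∅:

```python
# Hedged sketch: verify KKT conditions (i)-(ii) at a candidate point
# for min x1^2 + x2^2  s.t.  x1 + x2 - 1 = 0.  The candidate point
# and multiplier value are my own illustration, not from the notes.
import numpy as np

x_bar = np.array([0.5, 0.5])   # candidate local minimum
y_bar = -1.0                   # multiplier for the equality constraint

grad_f = 2 * x_bar                       # ∇f(x̄) = (2*x1, 2*x2)
grad_g = np.array([1.0, 1.0])            # ∇g(x̄) for g(x) = x1 + x2 - 1

stationarity = grad_f + y_bar * grad_g   # condition (i): should be 0
feasibility  = x_bar.sum() - 1.0         # condition (ii): should be 0
print(stationarity, feasibility)
```

Conditions (iii)-(v) hold vacuously here since I = ∅; with inequality constraints present, one would additionally check nonnegativity of the multipliers and complementary slackness.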
In the absence of convexity, a KKT point can be a global minimum, a local minimum, a "saddle point", or even a local or global maximum.
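To see the saddle-point case concretely, consider the unconstrained problem minimize x₁² − x₂² (my example, not from the notes). With E = I = ∅, the KKT conditions reduce to ∇f(x̄) = 0, which the origin satisfies, yet the origin is neither a minimum nor a maximum:

```python
# Hypothetical example: the origin is a KKT point of min x1^2 - x2^2
# (zero gradient, no constraints), but it is a saddle point.
import numpy as np

f = lambda x: x[0]**2 - x[1]**2
grad_f = lambda x: np.array([2.0 * x[0], -2.0 * x[1]])

x_bar = np.array([0.0, 0.0])
print(grad_f(x_bar))                 # zero gradient: condition (i) holds

# Moving along x1 increases f; moving along x2 decreases it: a saddle.
print(f([0.1, 0.0]), f([0.0, 0.1]))
```

This is why satisfying the KKT conditions is only a necessary certificate for local optimality, and why NLP algorithms that converge to KKT points need further safeguards (e.g. second-order checks) in the non-convex case.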