
# Issues in Non-Convex Optimization


Robert M. Freund, with assistance from Brian W. Anthony

April 22, 2004

© 2004 Massachusetts Institute of Technology

## 1 Outline

- General Nonlinear Optimization Problem
- Optimality Conditions for NLP
- Sequential Quadratic Programming (SQP) Method
- LOQO: Combining Interior-Point Methods and SQP
- Practical Issues in Solving NLP Problems

## 2 General Nonlinear Optimization Problem

$$
\begin{array}{llll}
\text{NLP:} & \underset{x}{\text{minimize}} & f(x) & \\
& \text{s.t.} & g_i(x) = 0, & i \in \mathcal{E} \\
& & g_i(x) \le 0, & i \in \mathcal{I} \\
& & x \in \mathbb{R}^n &
\end{array}
$$

where $f(x): \mathbb{R}^n \to \mathbb{R}$ and $g_i(x): \mathbb{R}^n \to \mathbb{R}$ for $i \in \mathcal{E} \cup \mathcal{I}$; $\mathcal{E}$ denotes the indices of the equality constraints, and $\mathcal{I}$ denotes the indices of the inequality constraints.

### 2.1 General Comments

Non-convex optimization problems arise in just about every economic and scientific domain:

- radiation therapy
- engineering product design
- economics: Nash equilibria
- finance: options pricing
- industrial engineering: traffic equilibria, supply chain management
- many other domains as well

Non-convex optimization is hard. Since $x - x^2 = 0$ if and only if $x \in \{0, 1\}$, we can formulate binary integer optimization as the following nonlinear optimization instance:

$$
\begin{array}{llll}
\text{BIP:} & \underset{x}{\text{minimize}} & c^T x & \\
& \text{s.t.} & Ax \le b & \\
& & x_j - x_j^2 = 0, & j = 1, \ldots, n \\
& & x \in \mathbb{R}^n &
\end{array}
$$

### 2.2 Useful Definitions

The feasible region $\mathcal{F}$ of NLP is the set

$$
\mathcal{F} = \{ x \mid g_i(x) = 0 \ \text{for } i \in \mathcal{E},\; g_i(x) \le 0 \ \text{for } i \in \mathcal{I} \}.
$$

We have the following definitions of local/global, strict/non-strict minima/maxima.

**Definition 2.1** $\bar{x} \in \mathcal{F}$ is a local minimum of NLP if there exists $\epsilon > 0$ such that $f(\bar{x}) \le f(x)$ for all $x \in B(\bar{x}, \epsilon) \cap \mathcal{F}$.

**Definition 2.2** $\bar{x} \in \mathcal{F}$ is a global minimum of NLP if $f(\bar{x}) \le f(x)$ for all $x \in \mathcal{F}$.

**Definition 2.3** $\bar{x} \in \mathcal{F}$ is a strict local minimum of NLP if there exists $\epsilon > 0$ such that $f(\bar{x}) < f(x)$ for all $x \in B(\bar{x}, \epsilon) \cap \mathcal{F}$, $x \ne \bar{x}$.

**Definition 2.4** $\bar{x} \in \mathcal{F}$ is a strict global minimum of NLP if $f(\bar{x}) < f(x)$ for all $x \in \mathcal{F}$, $x \ne \bar{x}$.
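The equivalence behind the BIP formulation above can be checked directly: the quadratic equality $x_j - x_j^2 = 0$ holds exactly when each component is 0 or 1. A minimal sketch (the function name is our own, not from the text):

```python
# Sketch: the constraint x_j - x_j^2 = 0 holds exactly when x_j is 0 or 1,
# so adding it to a linear program encodes binary integer programming.
def satisfies_binary_constraint(x, tol=1e-9):
    """Check x_j - x_j^2 = 0 (equivalently x_j * (1 - x_j) = 0) componentwise."""
    return all(abs(xj - xj**2) <= tol for xj in x)

assert satisfies_binary_constraint([0.0, 1.0, 1.0])   # binary vector: feasible
assert not satisfies_binary_constraint([0.5, 1.0])    # fractional: infeasible
```

Since binary integer programming is NP-hard, this reduction illustrates why no efficient general-purpose algorithm for non-convex NLP can be expected.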

**Definition 2.5** $\bar{x} \in \mathcal{F}$ is a local maximum of NLP if there exists $\epsilon > 0$ such that $f(\bar{x}) \ge f(x)$ for all $x \in B(\bar{x}, \epsilon) \cap \mathcal{F}$.

**Definition 2.6** $\bar{x} \in \mathcal{F}$ is a global maximum of NLP if $f(\bar{x}) \ge f(x)$ for all $x \in \mathcal{F}$.

**Definition 2.7** $\bar{x} \in \mathcal{F}$ is a strict local maximum of NLP if there exists $\epsilon > 0$ such that $f(\bar{x}) > f(x)$ for all $x \in B(\bar{x}, \epsilon) \cap \mathcal{F}$, $x \ne \bar{x}$.

**Definition 2.8** $\bar{x} \in \mathcal{F}$ is a strict global maximum of NLP if $f(\bar{x}) > f(x)$ for all $x \in \mathcal{F}$, $x \ne \bar{x}$.

If $\bar{x}$ is feasible for NLP, we let $I(\bar{x})$ denote the indices of the active inequality constraints, namely:

$$
I(\bar{x}) := \{ i \in \mathcal{I} \mid g_i(\bar{x}) = 0 \}.
$$

## 3 Optimality Conditions for NLP

**Theorem (Karush-Kuhn-Tucker Necessary Conditions).** Suppose that $f(x)$ and $g_i(x)$, $i \in \mathcal{E} \cup \mathcal{I}$, are all differentiable functions. Under mild additional conditions, if $\bar{x}$ is a local minimum of NLP, then there exists $\bar{y}$ for which

$$
\begin{aligned}
(i)\quad & \nabla f(\bar{x}) + \sum_{i \in \mathcal{E}} \bar{y}_i \nabla g_i(\bar{x}) + \sum_{i \in \mathcal{I}} \bar{y}_i \nabla g_i(\bar{x}) = 0 \\
(ii)\quad & g_i(\bar{x}) = 0, \quad i \in \mathcal{E} \\
(iii)\quad & g_i(\bar{x}) \le 0, \quad i \in \mathcal{I} \\
(iv)\quad & \bar{y}_i \ge 0, \quad i \in \mathcal{I} \\
(v)\quad & \bar{y}_i \cdot g_i(\bar{x}) = 0, \quad i \in \mathcal{I}
\end{aligned}
$$
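The KKT conditions can be verified numerically at a candidate point. A minimal sketch, using a toy problem of our own (minimize $x_1^2 + x_2^2$ subject to the single equality constraint $x_1 + x_2 - 1 = 0$, whose minimizer is $\bar{x} = (0.5, 0.5)$ with multiplier $\bar{y} = -1$):

```python
# Sketch: check KKT conditions (i) and (ii) for a toy equality-constrained NLP:
#   minimize x1^2 + x2^2   s.t.   x1 + x2 - 1 = 0
# (This problem and its solution are our own illustration, not from the notes.)
def grad_f(x):
    """Gradient of the objective f(x) = x1^2 + x2^2."""
    return [2 * x[0], 2 * x[1]]

def g(x):
    """Equality constraint g(x) = x1 + x2 - 1."""
    return x[0] + x[1] - 1.0

grad_g = [1.0, 1.0]  # gradient of g is constant

xbar, ybar = [0.5, 0.5], -1.0

# (i) stationarity: grad f(xbar) + ybar * grad g(xbar) = 0
stationarity = [gf + ybar * gg for gf, gg in zip(grad_f(xbar), grad_g)]
assert all(abs(v) < 1e-12 for v in stationarity)

# (ii) primal feasibility: g(xbar) = 0
assert abs(g(xbar)) < 1e-12
```

Conditions (iii)–(v) are vacuous here because the example has no inequality constraints; note that the multiplier on an equality constraint, like $\bar{y} = -1$ above, is not sign-restricted.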
In the absence of convexity, a KKT point can be a global minimum, a local minimum, a "saddlepoint", or even a local or global maximum.
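A one-dimensional unconstrained example makes this concrete. For $f(x) = x^3 - 3x$ (our own illustration), both $x = 1$ and $x = -1$ satisfy the first-order (KKT) condition $f'(x) = 0$, but only one is a local minimum:

```python
# Sketch: stationary (KKT) points of f(x) = x^3 - 3x, an unconstrained
# non-convex problem. Both roots of f'(x) = 0 are KKT points, yet one is a
# local minimum and the other a local maximum.
f = lambda x: x**3 - 3 * x
df = lambda x: 3 * x**2 - 3      # f'(x) = 0 at x = +1 and x = -1
d2f = lambda x: 6 * x            # second derivative classifies each point

for x in (1.0, -1.0):
    assert abs(df(x)) < 1e-12    # both satisfy the first-order condition

assert d2f(1.0) > 0              # x = +1: local minimum
assert d2f(-1.0) < 0             # x = -1: local maximum
```

This is why NLP algorithms that only seek KKT points must be paired with additional safeguards (second-order checks, multistart, merit functions) when the problem is non-convex.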

