Final Review: IE417

In the Beginning...

In the beginning, Weierstrass's theorem said that a continuous function achieves a minimum on a compact set. Using this, we showed that for a closed convex set S and a point y not in S, there is a unique point in S of minimum distance from y. This allowed us to show that we can separate a convex set S from any point not in the set. Finally, we arrived at Farkas' Theorem, which is at the heart of all optimization theory.

Convex Functions

Recall that if f : S ⊆ R^n → R is twice differentiable on the open convex set S, then f is convex if and only if the Hessian of f is positive semidefinite at each point in S.

If f is convex and S is a convex set, the point x* ∈ S is an optimal solution to the problem min { f(x) : x ∈ S } if and only if f has a subgradient ξ at x* such that

    ξᵀ(x − x*) ≥ 0  ∀ x ∈ S.

Note that this is equivalent to there being no improving, feasible directions. Hence, if S is open, then x* is an optimal solution if and only if there is a zero subgradient of f at x*.

Characterizing Improving Directions: Unconstrained Optimization

Consider the unconstrained optimization problem

    min f(x)  s.t.  x ∈ X,

where X is an open set (typically R^n). If f is differentiable at x* and there exists a vector d such that ∇f(x*)ᵀd < 0, then d is an improving direction. If ∇f(x*)ᵀd ≥ 0 ∀ d ∈ R^n (equivalently, ∇f(x*) = 0), then there are no improving directions of this form.

Optimality Conditions: Unconstrained Optimization

If x* is a local minimum and f is twice differentiable at x*, then ∇f(x*) = 0 and H(x*) is positive semidefinite.

If f is twice differentiable at x*, ∇f(x*) = 0, and H(x*) is positive definite, then x* is a (strict) local minimum.

If f is convex and x* is a local minimum, then x* is a global minimum.

If f is strictly convex and x* is a local minimum, then x* is the unique global minimum.

If f is convex and differentiable on the open set X, then x* ∈ X is a global minimum if and only if ∇f(x*) = 0.
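The unconstrained optimality conditions above can be checked numerically. The sketch below is an illustration, not from the notes: the test function f(x) = (x1 − 1)² + 2·x2² and its minimizer x* = (1, 0) are assumed examples, chosen so the gradient and Hessian are easy to verify by hand.

```python
import numpy as np

# Hypothetical test function (not from the notes): f(x) = (x1 - 1)^2 + 2*x2^2.
# Its unique minimizer is x* = (1, 0).

def grad_f(x):
    # Gradient of f: (2(x1 - 1), 4*x2)
    return np.array([2.0 * (x[0] - 1.0), 4.0 * x[1]])

def hessian_f(x):
    # Hessian of f is constant: diag(2, 4)
    return np.array([[2.0, 0.0], [0.0, 4.0]])

x_star = np.array([1.0, 0.0])

# First-order necessary condition: the gradient vanishes at x*.
gradient_is_zero = np.allclose(grad_f(x_star), 0.0)

# Second-order sufficient condition: the Hessian is positive definite
# (all eigenvalues strictly positive), so x* is a strict local minimum.
eigenvalues = np.linalg.eigvalsh(hessian_f(x_star))
hessian_is_pd = bool(np.all(eigenvalues > 0))

print(gradient_is_zero, hessian_is_pd)  # True True
```

Since f here is also strictly convex (its Hessian is positive definite everywhere), the local minimum found by these conditions is in fact the unique global minimum, as the last statements above assert.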
Constrained Optimization

Now consider the constrained optimization problem

    min f(x)
    s.t. g_i(x) ≤ 0  ∀ i ∈ {1, ..., m},
         h_i(x) = 0  ∀ i ∈ {1, ..., l},
         x ∈ X,

where X is again an open set (typically R^n).

Feasible and Improving Directions: Constrained Optimization

Definition: Let S be a nonempty set in R^n and let x* ∈ cl S. The cone of feasible directions of S at x* is given by

    D = { d : d ≠ 0 and x* + λd ∈ S ∀ λ ∈ (0, δ), for some δ > 0 }.

Definition: Let S be a nonempty set in R^n and let x* ∈ cl S. Given a function f : R^n → R, the cone of improving directions of f at x* is given by

    F = { d : f(x* + λd) < f(x*) ∀ λ ∈ (0, δ), for some δ > 0 }.

Necessary Conditions: Constrained Optimization

If x* is a local minimum, then F ∩ D = ∅.
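The condition F ∩ D = ∅ can be illustrated with a first-order sampling check. The problem below is an assumed example, not from the notes: minimize f(x) = x1² + x2² subject to g(x) = 1 − x1 − x2 ≤ 0, with candidate point x* = (0.5, 0.5) on the boundary of the feasible region (where x* happens to be the global minimizer). The sketch samples random directions and verifies that none is simultaneously improving (∇f(x*)ᵀd < 0) and feasible to first order (∇g(x*)ᵀd ≤ 0).

```python
import numpy as np

# Illustrative problem (not from the notes):
#   min f(x) = x1^2 + x2^2   s.t.  g(x) = 1 - x1 - x2 <= 0.
# The candidate point x* = (0.5, 0.5) lies on the boundary, g(x*) = 0.

grad_f = np.array([1.0, 1.0])    # gradient of f at x* = (2*x1, 2*x2)
grad_g = np.array([-1.0, -1.0])  # gradient of the active constraint g at x*

rng = np.random.default_rng(0)
violations = 0
for _ in range(10_000):
    d = rng.standard_normal(2)
    improving = grad_f @ d < 0   # first-order test for d in F
    feasible = grad_g @ d <= 0   # d keeps the active constraint satisfied
    if improving and feasible:
        violations += 1

# No sampled direction is both improving and feasible,
# consistent with F ∩ D = ∅ at a local minimum.
print(violations)  # 0
```

Here the two tests are mutually exclusive by construction (∇f(x*)ᵀd = d1 + d2 while ∇g(x*)ᵀd = −(d1 + d2)), so every improving direction immediately leaves the feasible region, which is exactly what the necessary condition predicts at a local minimum.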
This note was uploaded on 02/29/2008 for the course IE 417 taught by Professor Linderoth during the Spring '08 term at Lehigh University.