Final Review IE417

In the Beginning...
In the beginning, Weierstrass's Theorem said that a continuous function achieves a minimum on a compact set. Using this, we showed that for a convex set S and a point y ∉ S, there is a unique point in S at minimum distance from y. This allowed us to show that a convex set S can be separated by a hyperplane from any point not in the set. Finally, we arrived at Farkas' Theorem, which is at the heart of all optimization theory.

Convex Functions
Recall that if f : S → ℝ is twice differentiable on S ⊆ ℝⁿ, then f is convex if and only if the Hessian of f is positive semidefinite at each point in S. If f is convex and S is a convex set, the point x* ∈ S is an optimal solution to the problem min f(x) over x ∈ S if and only if f has a subgradient ξ at x* such that ξᵀ(x − x*) ≥ 0 ∀ x ∈ S. Note that this is equivalent to there being no improving, feasible directions. Hence, if S is open, then x* is an optimal solution if and only if f has a zero subgradient at x*.

Characterizing Improving Directions (Unconstrained Optimization)
Consider the unconstrained optimization problem

    min f(x) s.t. x ∈ X

where X is an open set (typically ℝⁿ). If f is differentiable at x* and there exists a vector d such that ∇f(x*)ᵀd < 0, then d is an improving direction. If ∇f(x*)ᵀd ≥ 0 ∀ d ∈ ℝⁿ (equivalently, if ∇f(x*) = 0), then there are no improving directions of this form.

Optimality Conditions (Unconstrained Optimization)
- If x* is a local minimum and f is twice differentiable at x*, then ∇f(x*) = 0 and H(x*) is positive semidefinite.
- If f is twice differentiable at x*, ∇f(x*) = 0, and H(x*) is positive definite, then x* is a local minimum.
- If f is convex and x* is a local minimum, then x* is a global minimum.
- If f is strictly convex and x* is a local minimum, then x* is the unique global minimum.
- If f is convex and differentiable on the open set X, then x* ∈ X is a global minimum if and only if ∇f(x*) = 0.

Constrained Optimization
Now consider the constrained optimization problem

    min f(x)
    s.t. gᵢ(x) ≤ 0, i = 1, ..., m
         hᵢ(x) = 0, i = 1, ..., l
         x ∈ X

where X is again an open set (typically ℝⁿ).

Feasible and Improving Directions (Constrained Optimization)
Definition: Let S be a nonempty set in ℝⁿ and let x* ∈ cl S. The cone of feasible directions of S at x* is given by

    D = {d : d ≠ 0 and x* + λd ∈ S ∀ λ ∈ (0, δ), for some δ > 0}.

Definition: Let S be a nonempty set in ℝⁿ and let x* ∈ cl S. Given a function f : ℝⁿ → ℝ, the cone of improving directions of f at x* is given by

    F = {d : f(x* + λd) < f(x*) ∀ λ ∈ (0, δ), for some δ > 0}.

Necessary Conditions (Constrained Optimization)
If x* is a local minimum, then F ∩ D = ∅. ...

The short, self-contained sketches below illustrate several of these results numerically.
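A minimal numeric sketch of the closest-point and separation results, assuming S is the closed unit ball in ℝ² (an illustrative convex set, not one from the notes) and y is a point outside S. The unique closest point in S to y is y/‖y‖, and ξ = y − x̄ is the normal of a hyperplane separating y from S.

```python
# Closest-point and separation sketch. S is the closed unit ball in R^2,
# y lies outside S; the unique closest point of S to y is y/||y||, and
# xi = y - x_bar defines a separating hyperplane. All data illustrative.
import numpy as np

rng = np.random.default_rng(0)
y = np.array([2.0, 1.0])              # point outside S
x_bar = y / np.linalg.norm(y)         # unique closest point of S to y
xi = y - x_bar                        # separating hyperplane normal

# Verify xi^T x < xi^T y for sampled points x in S.
for _ in range(1000):
    x = rng.normal(size=2)
    x /= max(1.0, np.linalg.norm(x))  # pull the sample into the unit ball
    assert xi @ x < xi @ y
print("separation verified on 1000 sampled points of S")
```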
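The review invokes Farkas' Theorem without stating it. One standard form is given below; the exact variant used in the course may differ, though the common forms are equivalent.

```latex
% One standard statement of Farkas' Theorem: for A in R^{m x n} and
% c in R^n, exactly one of the following two systems has a solution.
\[
\text{(I)}\quad Ax \le 0,\; c^{\mathsf T}x > 0 \quad (x \in \mathbb{R}^n)
\qquad\text{or}\qquad
\text{(II)}\quad A^{\mathsf T}y = c,\; y \ge 0 \quad (y \in \mathbb{R}^m).
\]
```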
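A small sketch of the second-order convexity test. For a quadratic f(x) = xᵀQx the Hessian is the constant matrix Q + Qᵀ, so a single eigenvalue check settles convexity; Q here is an arbitrary illustrative choice.

```python
# Second-order convexity test: f is convex iff its Hessian is positive
# semidefinite on S. For f(x) = x^T Q x the Hessian is Q + Q^T everywhere.
import numpy as np

Q = np.array([[2.0, 1.0],
              [1.0, 3.0]])
hessian = Q + Q.T                        # constant Hessian of f(x) = x^T Q x
eigenvalues = np.linalg.eigvalsh(hessian)
print("Hessian eigenvalues:", eigenvalues)
print("f is convex:", bool(np.all(eigenvalues >= 0)))
```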
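A sketch of the subgradient optimality condition, on the illustrative convex problem min x² over S = [1, 3] (both the function and the set are my choices, not the course's). At x* = 1 the only subgradient is f′(1) = 2, and 2(x − 1) ≥ 0 for every x ∈ S, so the condition certifies optimality.

```python
# Subgradient optimality check for min f(x) = x^2 over S = [1, 3].
import numpy as np

f = lambda x: x ** 2
x_star, xi = 1.0, 2.0                   # candidate point, subgradient there
xs = np.linspace(1.0, 3.0, 201)         # a grid over S = [1, 3]
assert np.all(xi * (xs - x_star) >= 0)  # xi^T (x - x*) >= 0 holds on S
assert np.all(f(xs) >= f(x_star))       # and x* indeed minimizes f over S
print("x* = 1 is optimal for min f(x) over [1, 3]")
```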
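A sketch of the first-order improving-direction test: if ∇f(x)ᵀd < 0, then f(x + λd) < f(x) for all sufficiently small λ > 0. The function f below is an illustrative choice.

```python
# Improving-direction test: grad f(x)^T d < 0 implies f decreases along d.
import numpy as np

def f(x):
    return x[0] ** 2 + 2 * x[1] ** 2

def grad_f(x):
    return np.array([2 * x[0], 4 * x[1]])

x = np.array([1.0, 1.0])
d = -grad_f(x)                       # steepest-descent direction
assert grad_f(x) @ d < 0             # so d is an improving direction
for lam in (1e-1, 1e-2, 1e-3):
    assert f(x + lam * d) < f(x)     # f decreases along d for small steps
print("d improves f at x, as the gradient test predicts")
```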
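A sketch of the sufficient optimality conditions, again on the illustrative f(x) = x₁² + 2x₂². The gradient vanishes at x* = (0, 0) and the constant Hessian diag(2, 4) is positive definite, so x* is a local minimum; since f is convex, it is also the global minimum.

```python
# Second-order sufficient conditions at x* = (0, 0) for f = x1^2 + 2*x2^2.
import numpy as np

x_star = np.zeros(2)
grad = np.array([2 * x_star[0], 4 * x_star[1]])
hessian = np.diag([2.0, 4.0])
assert np.allclose(grad, 0.0)                      # first-order condition
assert np.all(np.linalg.eigvalsh(hessian) > 0.0)   # H(x*) positive definite
print("x* = (0, 0) satisfies the sufficient conditions")
```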
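Finally, a numeric sketch of the necessary condition F ∩ D = ∅, on the illustrative problem min x₁ + x₂ s.t. x₁² + x₂² ≤ 1, whose minimizer is x* = (−1/√2, −1/√2). Because f is linear, d ∈ F exactly when ∇fᵀd < 0; for this ball constraint, d ∈ D exactly when ∇g(x*)ᵀd < 0, since tangent directions immediately leave the ball. These two conditions are incompatible at x*, which we confirm on random directions.

```python
# F ∩ D = ∅ at the minimizer of min x1 + x2 s.t. x1^2 + x2^2 <= 1.
import numpy as np

rng = np.random.default_rng(1)
x_star = np.array([-1.0, -1.0]) / np.sqrt(2.0)
grad_f = np.array([1.0, 1.0])
grad_g = 2.0 * x_star               # gradient of g(x) = x1^2 + x2^2 - 1

for _ in range(10_000):
    d = rng.normal(size=2)
    improving = grad_f @ d < 0      # d ∈ F (f is linear)
    feasible = grad_g @ d < 0       # d ∈ D (for this ball constraint)
    assert not (improving and feasible)
print("no sampled direction is both improving and feasible at x*")
```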