Michael T. Heath, Scientific Computing
Optimization Problems · One-Dimensional Optimization · Multi-Dimensional Optimization

One practical check: try multiple widely separated starting points and see if all produce same result.

For some problems, such as linear programming, global optimization is more tractable.

Existence of Minimum

Function f : S ⊆ R^n → R is coercive on unbounded set S if f(x) → +∞ as ‖x‖ → ∞, i.e., f(x) must be large whenever ‖x‖ is large.

If f is coercive on closed, unbounded set S ⊆ R^n, then f has global minimum on S.
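The multiple-starting-points check can be sketched numerically. A minimal sketch, assuming a hypothetical 1-D objective f(x) = x^4 − 3x^2 + x (coercive, but with two local minima) and plain gradient descent as a stand-in for any local optimizer:

```python
# Hypothetical test problem, not from the slides: f has a global
# minimum near x = -1.30 and another local minimum near x = 1.13.

def f(x):
    return x**4 - 3*x**2 + x

def fprime(x):
    return 4*x**3 - 6*x + 1

def gradient_descent(x, step=0.01, iters=5000):
    # Simple fixed-step descent; stands in for any local method.
    for _ in range(iters):
        x -= step * fprime(x)
    return x

starts = [-3.0, 0.0, 3.0]              # widely separated starting points
results = [gradient_descent(x0) for x0 in starts]
# Starts at -3 and 0 converge near x = -1.30 (the global minimum),
# but the start at 3 converges near x = 1.13, so the runs disagree:
# a single local solve would not have been trustworthy here.
```

Agreement among all starts is only evidence, not proof, of a global minimum; disagreement, as here, is a definite warning.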
Level Sets

Level set for function f : S ⊆ R^n → R is set of all points in S for which f has some given constant value.

For given γ ∈ R, sublevel set is

    L_γ = {x ∈ S : f(x) ≤ γ}

If continuous function f on S ⊆ R^n has nonempty sublevel set that is closed and bounded, then f has global minimum on S.

If S is unbounded, then f is coercive on S if, and only if, all of its sublevel sets are bounded.

Uniqueness of Minimum

Set S ⊆ R^n is convex if it contains line segment between any two of its points.

Function f : S ⊆ R^n → R is convex on convex set S if its graph along any line segment in S lies on or below chord connecting function values at endpoints of segment.

Any local minimum of convex function f on convex set S ⊆ R^n is global minimum of f on S.

Any local minimum of strictly convex function f on convex set S ⊆ R^n is unique global minimum of f on S.
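The chord definition of convexity can be probed numerically. A sketch, assuming hypothetical test functions (random sampling can refute convexity, but passing is only evidence, never proof):

```python
import random

def is_convex_on_samples(f, lo, hi, trials=1000, seed=0):
    """Test f(a*x + (1-a)*y) <= a*f(x) + (1-a)*f(y) on random
    points x, y in [lo, hi] and random weights a in [0, 1]."""
    rng = random.Random(seed)
    for _ in range(trials):
        x, y = rng.uniform(lo, hi), rng.uniform(lo, hi)
        a = rng.random()
        # Chord condition: graph lies on or below the chord.
        if f(a*x + (1-a)*y) > a*f(x) + (1-a)*f(y) + 1e-12:
            return False                # found a violation: not convex
    return True                         # no violation found in samples

convex_example = is_convex_on_samples(lambda x: x*x, -5, 5)   # True
cubic_example = is_convex_on_samples(lambda x: x**3, -5, 5)   # False
```

x^2 satisfies the chord condition everywhere; x^3 violates it for negative arguments, so the sampler finds a counterexample.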
First-Order Optimality Condition

Generalization to function of n variables is to find critical point, i.e., solution of nonlinear system

    ∇f(x) = 0

where ∇f(x) is gradient vector of f, whose ith component is ∂f(x)/∂x_i.

But not all critical points are minima: they can also be maxima or saddle points.

Second-Order Optimality Condition

For twice continuously differentiable f : S ⊆ R^n → R, we can distinguish among critical points by considering Hessian matrix H_f(x), defined by

    {H_f(x)}_ij = ∂²f(x) / ∂x_i ∂x_j

At critical point x*, if H_f(x*) is

  - positive definite, then x* is minimum of f
  - negative definite, then x* is maximum of f
  - indefinite, then x* is saddle point of f
  - singular, then various pathological situations are possible
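The Hessian classification above amounts to inspecting eigenvalue signs, since a symmetric matrix is positive definite exactly when all its eigenvalues are positive. A minimal sketch, assuming NumPy is available and using the standard saddle example f(x, y) = x² − y², whose critical point (0, 0) has Hessian diag(2, −2):

```python
import numpy as np

def classify(H, tol=1e-10):
    """Classify a critical point from its (symmetric) Hessian H."""
    eig = np.linalg.eigvalsh(H)          # eigenvalues in ascending order
    if np.any(np.abs(eig) <= tol):
        return "singular: inconclusive"  # pathological cases possible
    if np.all(eig > 0):
        return "minimum"                 # positive definite
    if np.all(eig < 0):
        return "maximum"                 # negative definite
    return "saddle point"                # indefinite: mixed signs

H = np.array([[2.0, 0.0], [0.0, -2.0]])  # Hessian of x^2 - y^2 at origin
print(classify(H))                       # saddle point
```

In practice one tests definiteness with a Cholesky factorization rather than a full eigendecomposition, but the eigenvalue view matches the slide's case analysis directly.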
Constrained Optimality

If problem is constrained, only feasible directions are relevant.

For equality-constrained problem

    min f(x) subject to g(x) = 0

where f : R^n → R and g : R^n → R^m, with m ≤ n, necessary condition for feasible point x* to be solution is that negative gradient of f lie in space spanned by constraint normals,

    −∇f(x*) = J_g^T(x*) λ*

where J_g is Jacobian matrix of g and λ* is vector of Lagrange multipliers.

Constrained Optimality, continued

Lagrangian function L : R^(n+m) → R is defined by

    L(x, λ) = f(x) + λ^T g(x)

Its gradient is given by

    ∇L(x, λ) = [ ∇f(x) + J_g^T(x) λ ;  g(x) ]
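The stationarity of the Lagrangian can be verified numerically on a small example. A sketch, assuming a hypothetical problem not from the slides: min x + y subject to g(x, y) = x² + y² − 2 = 0, whose solution is (x*, y*) = (−1, −1) with multiplier λ* = 1/2:

```python
# Verify the first-order condition: the full gradient of the
# Lagrangian, [grad f + Jg^T * lam ; g], vanishes at the solution.

def grad_f(x, y):
    return (1.0, 1.0)                 # gradient of f(x, y) = x + y

def g(x, y):
    return x*x + y*y - 2.0            # equality constraint

def jac_g(x, y):
    return (2.0*x, 2.0*y)             # Jacobian (one row) of g

def grad_L(x, y, lam):
    gf, jg = grad_f(x, y), jac_g(x, y)
    return (gf[0] + jg[0]*lam,        # stationarity in x
            gf[1] + jg[1]*lam,        # stationarity in y
            g(x, y))                  # feasibility

print(grad_L(-1.0, -1.0, 0.5))        # (0.0, 0.0, 0.0)
```

All three components are zero: the negative gradient (−1, −1) of f is exactly λ* times the constraint normal (−2, −2), and the point is feasible.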
Fall '11, Wasfy, Mechanical Engineering