Active Set Strategy

Inequality constraints are provisionally divided into those that are satisfied already (and can therefore be temporarily disregarded) and those that are violated (and are therefore temporarily treated as equality constraints)

This division of constraints is revised as iterations proceed until eventually correct constraints are identified that are binding at solution

Penalty Methods

If x*_ρ is solution to

    min_x φ_ρ(x) = f(x) + (1/2) ρ g(x)^T g(x)

then under appropriate conditions

    lim_{ρ→∞} x*_ρ = x*

This enables use of unconstrained optimization methods, but problem becomes ill-conditioned for large ρ, so we solve sequence of problems with gradually increasing values of ρ, with minimum for each problem used as starting point for next problem

< interactive example >

Michael T. Heath    Scientific Computing    65 / 74
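The sequence-of-problems idea can be sketched in a few lines of Python. This is an assumed illustration, not from the slides, using the equality-constrained quadratic program min 0.5 x_1^2 + 2.5 x_2^2 subject to x_1 − x_2 = 1 (which appears later in these slides) as the test problem. Because φ_ρ is quadratic here, each unconstrained minimization reduces to solving ∇φ_ρ = 0, a 2×2 linear system.

```python
import numpy as np

# Penalty-method sketch (assumed illustration) for
#   min f(x) = 0.5*x1^2 + 2.5*x2^2  subject to  g(x) = x1 - x2 - 1 = 0
# phi_rho(x) = f(x) + 0.5*rho*g(x)^2 is quadratic, so grad phi_rho = 0 is
#   (1 + rho)*x1 - rho*x2       = rho
#   -rho*x1      + (5 + rho)*x2 = -rho

def penalty_minimizer(rho):
    A = np.array([[1.0 + rho, -rho],
                  [-rho, 5.0 + rho]])
    b = np.array([rho, -rho])
    return np.linalg.solve(A, b)

for rho in [1.0, 10.0, 100.0, 1e4, 1e6]:
    x = penalty_minimizer(rho)
    print(rho, x)   # iterates approach x* = (5/6, -1/6) as rho grows
```

In a real implementation each minimizer would also serve as the starting point for the next (larger) value of ρ, as the slide describes; the closed-form solve above makes warm starts unnecessary for this toy problem.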
Barrier Methods

For inequality-constrained problems, another alternative is barrier function, such as

    φ_μ(x) = f(x) − μ Σ_{i=1}^{p} 1 / h_i(x)

or

    φ_μ(x) = f(x) − μ Σ_{i=1}^{p} log(−h_i(x))

which increasingly penalize feasible points as they approach boundary of feasible region

Again, solutions of unconstrained problem approach x* as μ → 0, but problems are increasingly ill-conditioned, so solve sequence of problems with decreasing values of μ

Barrier functions are basis for interior point methods for linear programming

Example: Constrained Optimization

Consider quadratic programming problem

    min_x f(x) = 0.5 x_1^2 + 2.5 x_2^2

subject to

    g(x) = x_1 − x_2 − 1 = 0

Lagrangian function is given by

    L(x, λ) = f(x) + λ g(x) = 0.5 x_1^2 + 2.5 x_2^2 + λ (x_1 − x_2 − 1)
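The behavior of the log barrier as μ → 0 can be seen on a toy one-dimensional problem. This is an assumed illustration, not from the slides: min f(x) = x subject to h(x) = 1 − x ≤ 0, i.e. x ≥ 1. Here φ_μ(x) = x − μ log(x − 1), whose interior minimizer is x*_μ = 1 + μ, confirmed below by a crude damped Newton iteration started strictly inside the feasible region.

```python
# Log-barrier sketch (assumed illustration) for
#   min f(x) = x  subject to  h(x) = 1 - x <= 0  (i.e. x >= 1)
# phi_mu(x) = x - mu*log(x - 1); phi_mu'(x) = 1 - mu/(x - 1) = 0
# gives x*_mu = 1 + mu, which approaches x* = 1 as mu -> 0.

def barrier_minimizer(mu, x0=2.0, iters=50):
    x = x0
    for _ in range(iters):
        grad = 1.0 - mu / (x - 1.0)      # phi_mu'(x)
        hess = mu / (x - 1.0) ** 2       # phi_mu''(x)
        step = grad / hess               # Newton step
        # damp the step so the iterate stays strictly feasible (x > 1)
        while x - step <= 1.0:
            step *= 0.5
        x -= step
    return x

for mu in [1.0, 0.1, 0.01, 0.001]:
    print(mu, barrier_minimizer(mu))   # minimizers 1 + mu approach x* = 1
```

The damping loop is what keeps every iterate interior, the defining property of barrier (interior point) methods; an undamped Newton step from a point far from the boundary would jump outside the feasible region, where the log barrier is undefined.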
Example, continued

Since

    ∇f(x) = [x_1, 5 x_2]^T   and   J_g(x) = [1  −1]

we have

    ∇_x L(x, λ) = ∇f(x) + J_g(x)^T λ = [x_1, 5 x_2]^T + λ [1, −1]^T
Example, continued

So system to be solved for critical point of Lagrangian is

    x_1 + λ = 0
    5 x_2 − λ = 0
    x_1 − x_2 = 1

which in this case is linear system

    [ 1   0   1 ] [x_1]   [0]
    [ 0   5  −1 ] [x_2] = [0]
    [ 1  −1   0 ] [ λ ]   [1]

Solving this system, we obtain solution

    x_1 = 0.833,   x_2 = −0.167,   λ = −0.833
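The 3×3 linear system for the critical point of the Lagrangian is small enough to check directly; a minimal numpy sketch (an added illustration, not part of the slides):

```python
import numpy as np

# Critical point of the Lagrangian for the quadratic programming example:
# rows encode  x1 + lam = 0,  5*x2 - lam = 0,  x1 - x2 = 1
K = np.array([[1.0,  0.0,  1.0],
              [0.0,  5.0, -1.0],
              [1.0, -1.0,  0.0]])
rhs = np.array([0.0, 0.0, 1.0])

x1, x2, lam = np.linalg.solve(K, rhs)
print(x1, x2, lam)   # approximately 0.833, -0.167, -0.833
```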
Linear Programming

One of most important and common constrained optimization problems is linear programming

One standard form for such problems is

    min_x f(x) = c^T x   subject to   A x = b   and   x ≥ 0

where m < n, A ∈ R^{m×n}, b ∈ R^m, and c, x ∈ R^n

Feasible region is convex polyhedron in R^n, and minimum must occur at one of its vertices

Simplex method moves systematically from vertex to vertex until minimum point is found

Linear Programming, continued

Simplex method is reliable and normally efficient, able to solve problems with thousands of variables, but can require time exponential in size of problem in worst case

Interior point methods for linear programming developed in recent years have polynomial worst-case solution time

These methods move through interior of feasible region, not restricting themselves to investigating only its vertices

Although interior point methods have significant practical impact, simplex method is still predominant method in standard packages for linear programming, and its effectiveness in practice is excellent
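The fact that the minimum occurs at a vertex can be verified by brute force on a tiny LP. This sketch is an added illustration, not an algorithm from the slides (and not how simplex works): it enumerates intersections of pairs of constraint boundaries, keeps the feasible ones, and evaluates the objective at each, for the assumed toy problem min −x_1 − x_2 subject to x_1 ≤ 1, x_2 ≤ 1, x ≥ 0.

```python
import itertools
import numpy as np

# Brute-force vertex check (illustration only) for the toy LP
#   min -x1 - x2  s.t.  x1 <= 1, x2 <= 1, -x1 <= 0, -x2 <= 0
# written in inequality form A x <= b.
c = np.array([-1.0, -1.0])
A = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([1.0, 1.0, 0.0, 0.0])

vertices = []
for i, j in itertools.combinations(range(len(b)), 2):
    M = A[[i, j]]
    if abs(np.linalg.det(M)) < 1e-12:
        continue                       # parallel boundaries, no vertex
    v = np.linalg.solve(M, b[[i, j]])
    if np.all(A @ v <= b + 1e-9):      # keep only feasible intersections
        vertices.append(v)

best = min(vertices, key=lambda v: c @ v)
print(best, c @ best)   # vertex (1, 1) with objective value -2
```

Enumerating all vertices is exponential in general, which is exactly why simplex visits them selectively and interior point methods avoid them altogether.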
Example: Linear Programming

To illustrate linear programming, consider

    min_x c^T x = −8 x_1 − 11 x_2

subject to linear inequality constraints

    5 x_1 + 4 x_2 ≤ 40,   −x_1 + 3 x_2 ≤ 12,   x_1 ≥ 0,   x_2 ≥ 0

Example, continued

Minimum value must occur at vertex of feasible region, in this case at x_1 = 3.79, x_2 = 5.26, where objective function has value −88.2
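The slides' numbers can be reproduced with an off-the-shelf LP solver; a minimal sketch using scipy.optimize.linprog (an added illustration, not part of the slides):

```python
from scipy.optimize import linprog

# The slides' LP in linprog's inequality form A_ub @ x <= b_ub;
# nonnegativity bounds x >= 0 are linprog's default.
c = [-8.0, -11.0]                     # minimize -8*x1 - 11*x2
A_ub = [[5.0, 4.0], [-1.0, 3.0]]
b_ub = [40.0, 12.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub)
print(res.x, res.fun)   # approximately [3.79, 5.26], -88.2
```

The exact optimum is x_1 = 72/19, x_2 = 100/19 with objective −1676/19 ≈ −88.2, matching the rounded values on the slide.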
Fall '11, Wasfy, Mechanical Engineering