CS205 – Class 13
Covered in class:
1, 3
Readings:
6.7, 7.2 to 7.3.3, 7.4
1. Constrained Optimization
   a. Minimize $f(\vec{x})$ subject to constraints $\vec{g}(\vec{x}) = \vec{0}$
      i. Here $\vec{x} \in \mathbb{R}^n$ and $\vec{g}(\vec{x}) = \vec{0}$ is a system of $m \le n$ equations
      ii. One can show that a solution $\vec{x}$ must satisfy $-\nabla f(\vec{x}) = J_g^T(\vec{x})\,\vec{\lambda}$
         1. $J_g(\vec{x})$ is the Jacobian matrix of $\vec{g}$
         2. $\vec{\lambda}$ is an $m$-vector of Lagrange multipliers
         3. This condition says that we cannot reduce the objective function without violating the constraints
      iii. Define the Lagrangian $L(\vec{x}, \vec{\lambda}) = f(\vec{x}) + \vec{\lambda}^T \vec{g}(\vec{x})$
         1. The critical points are found by setting $\nabla L(\vec{x}, \vec{\lambda}) = \begin{bmatrix} \nabla f(\vec{x}) + J_g^T(\vec{x})\,\vec{\lambda} \\ \vec{g}(\vec{x}) \end{bmatrix} = \vec{0}$
         2. Suppose for simplicity that $\vec{g}$ is a linear function. Then the Hessian is
            $H_L(\vec{x}) = \begin{bmatrix} H_f(\vec{x}) & J_g^T(\vec{x}) \\ J_g(\vec{x}) & 0 \end{bmatrix}$
            where the $\vec{x}$ partial derivatives of $J_g^T(\vec{x})\,\vec{\lambda}$ vanish because $\vec{g}$ is linear.
            a. Note that $H_L$ is not positive definite
            b. It turns out that positive definiteness is only needed on the tangent space to the constraint surface, i.e. on the null space of $J_g$.
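To make the tangent-space remark concrete, here is a small numerical sketch (using NumPy, which these notes do not prescribe) built from the quadratic objective and linear constraint of the example that follows: the full Hessian of the Lagrangian is indefinite, yet the reduced Hessian $Z^T H_f Z$ on the null space of $J_g$ is positive.

```python
import numpy as np

# Values match the worked example below: f(x) = 0.5*x1^2 + 2.5*x2^2,
# g(x) = x2 - x1 - 1, so H_f = diag(1, 5) and J_g = [-1, 1].
Hf = np.diag([1.0, 5.0])
Jg = np.array([[-1.0, 1.0]])

# Full Hessian of the Lagrangian: [[H_f, J_g^T], [J_g, 0]]
H = np.block([[Hf, Jg.T],
              [Jg, np.zeros((1, 1))]])
eigs = np.linalg.eigvalsh(H)
print(eigs)  # mixed signs: H is indefinite

# Z spans null(J_g): directions z with J_g z = 0, here z = (1, 1)/sqrt(2)
Z = np.array([[1.0], [1.0]]) / np.sqrt(2.0)
projected = Z.T @ Hf @ Z  # 1x1 reduced Hessian on the tangent space
print(projected)  # positive, so the constrained critical point is a minimum
```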
      iv. Consider $f(\vec{x}) = 0.5x_1^2 + 2.5x_2^2$ with $g(\vec{x}) = x_2 - x_1 - 1 = 0$
1.
( )
1
2
(, ) .
5
2
.
5
1
x
x
+
−
−
G
G
         2. $\nabla L = \begin{bmatrix} x_1 - \lambda \\ 5x_2 + \lambda \\ x_2 - x_1 - 1 \end{bmatrix} = \vec{0}$
         3. so we solve
            $\begin{bmatrix} 1 & 0 & -1 \\ 0 & 5 & 1 \\ -1 & 1 & 0 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \lambda \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$
            to obtain
            $\begin{bmatrix} x_1 \\ x_2 \\ \lambda \end{bmatrix} = \begin{bmatrix} -0.833 \\ 0.167 \\ -0.833 \end{bmatrix}$
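The $3 \times 3$ system above can be checked numerically; a minimal sketch in NumPy (an assumption, since the notes don't specify a language):

```python
import numpy as np

# Stationarity system for f(x) = 0.5*x1^2 + 2.5*x2^2 with
# g(x) = x2 - x1 - 1 = 0 (the constant 1 moved to the right-hand side)
K = np.array([[ 1.0, 0.0, -1.0],
              [ 0.0, 5.0,  1.0],
              [-1.0, 1.0,  0.0]])
rhs = np.array([0.0, 0.0, 1.0])

x1, x2, lam = np.linalg.solve(K, rhs)
print(x1, x2, lam)  # approximately -0.833, 0.167, -0.833

# The gradient of f at the solution is a scalar multiple of the gradient
# of g, i.e. perpendicular to the constraint surface:
grad_f = np.array([x1, 5 * x2])  # gradient of f at the solution
grad_g = np.array([-1.0, 1.0])   # gradient of the constraint
print(grad_f / grad_g)           # equal components: the gradients are parallel
```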
The gradient of the function is perpendicular to the constraint surface at the constrained minimum.
2. Linear Programming
   a. Minimize $\vec{c} \cdot \vec{x}$ subject to constraints $A\vec{x} = \vec{b}$ and $\vec{x} \ge \vec{0}$
   b. The feasible region is a convex polyhedron in $n$-dimensional space
   c. The minimum must occur at one of the vertices of the polyhedron
   d. Simplex method – systematically examine a sequence of vertices to find the one yielding the minimum
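Points b–d can be illustrated with a brute-force sketch that visits every basic feasible solution (vertex) of a small standard-form problem and picks the best. This is not the simplex method, which moves between adjacent vertices far more cleverly, and the data below are hypothetical; NumPy is assumed.

```python
import itertools
import numpy as np

# Hypothetical example: maximize x1 + 2*x2 subject to x1 + x2 <= 4 and
# x1 + 3*x2 <= 6, written in standard form min c.x, A x = b, x >= 0
# by adding slack variables x3, x4 and negating the objective.
c = np.array([-1.0, -2.0, 0.0, 0.0])
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])

m, n = A.shape
best_x, best_val = None, np.inf
for cols in itertools.combinations(range(n), m):
    B = A[:, cols]
    if abs(np.linalg.det(B)) < 1e-12:
        continue                    # chosen columns do not form a basis
    xB = np.linalg.solve(B, b)
    if np.any(xB < -1e-12):
        continue                    # basic solution violates x >= 0
    x = np.zeros(n)
    x[list(cols)] = xB              # nonbasic variables stay at zero
    val = c @ x
    if val < best_val:
        best_x, best_val = x, val

print(best_x, best_val)  # vertex (3, 1, 0, 0) with objective -5
```

Enumerating all $\binom{n}{m}$ bases is exponential in general, which is exactly why the simplex method's guided walk along the polyhedron's vertices matters.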
3. Interpolation
Fall '07 – Fedkiw