6.252 NONLINEAR PROGRAMMING
LECTURE 9: FEASIBLE DIRECTION METHODS
LECTURE OUTLINE
Conditional Gradient Method
Gradient Projection Methods
A feasible direction at a point x ∈ X is a vector d ≠ 0
such that x + αd is feasible for all sufficiently small α > 0.
(Figure: feasible directions at a point of X, shown in the (x1, x2)-plane.)
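The conditional gradient method named in the outline generates such feasible directions by minimizing the linearized cost over X. A minimal sketch of my own (the box constraint and the quadratic cost are illustrative assumptions, not an example from the slides):

```python
import numpy as np

# Conditional gradient (Frank-Wolfe) sketch for min f(x) over the box
# X = [0, 1]^2, with f(x) = ||x - c||^2 / 2 for a target c outside X.
c = np.array([1.5, -0.5])

def grad(x):
    return x - c

x = np.array([0.0, 1.0])           # feasible starting point
for k in range(100):
    g = grad(x)
    # Linear-minimization oracle over the box: each coordinate goes to
    # the bound that minimizes g' * x_bar.
    x_bar = np.where(g > 0, 0.0, 1.0)
    d = x_bar - x                   # feasible direction: x + alpha*d stays in X
    alpha = 2.0 / (k + 2)           # standard diminishing stepsize
    x = x + alpha * d

print(x)  # -> [1. 0.], the projection of c on the box
```

Here the linear oracle is trivial because X is a box; for a general polyhedron it is a linear program.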
6.252 NONLINEAR PROGRAMMING
LECTURE 8
OPTIMIZATION OVER A CONVEX SET;
OPTIMALITY CONDITIONS
Problem: min_{x∈X} f(x), where:
(a) X ⊂ ℝ^n is nonempty, convex, and closed.
(b) f is continuously differentiable over X.
Local and global minima. If f is convex, every local minimum is also a global minimum.
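A tiny numerical check of my own (the quadratic and the interval are made-up examples): for convex f over convex closed X, a minimizer x* satisfies the first-order condition ∇f(x*)′(x − x*) ≥ 0 for all x ∈ X, even when x* sits on the boundary.

```python
# minimize f(x) = (x - 2)^2 over X = [0, 1].  f is convex and X is
# convex and closed, so x* = 1 (the endpoint nearest 2) is both the
# local and the global minimum over X.
def fprime(x):
    return 2.0 * (x - 2.0)

x_star = 1.0
xs = [i / 100.0 for i in range(101)]   # grid over X = [0, 1]
# first-order condition: f'(x*) * (x - x*) >= 0 for all x in X
ok = all(fprime(x_star) * (x - x_star) >= 0.0 for x in xs)
print(ok)  # True: f'(1) = -2 and (x - 1) <= 0 on X
```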
6.252 NONLINEAR PROGRAMMING
LECTURE 2
UNCONSTRAINED OPTIMIZATION OPTIMALITY CONDITIONS
LECTURE OUTLINE
Unconstrained Optimization
Local Minima
Necessary Conditions for Local Minima
Sufficient Conditions for Local Minima
The Role of Convexity
LECTURE SLIDES ON NONLINEAR PROGRAMMING
BASED ON LECTURES GIVEN AT THE
MASSACHUSETTS INSTITUTE OF TECHNOLOGY
CAMBRIDGE, MASS
DIMITRI P. BERTSEKAS
These lecture slides are based on the book:
Nonlinear Programming, Athena Scientific,
by Dimitri P. Bertsekas;
6.252 NONLINEAR PROGRAMMING
LECTURE 6
NEWTON AND GAUSS-NEWTON METHODS
LECTURE OUTLINE
Newton's Method
Convergence Rate of the Pure Form
Global Convergence
Variants of Newton's Method
Least Squares Problems
The Gauss-Newton Method
NEWTON'S METHOD
x^{k+1} = x^k − (∇²f(x^k))^{−1} ∇f(x^k)   (pure form: unit stepsize)
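As a concrete illustration, here is a minimal sketch of the pure Newton iteration on a one-dimensional convex function of my own choosing (not an example from the slides):

```python
import math

# Pure Newton iteration x_{k+1} = x_k - f'(x_k)/f''(x_k) applied to
# the convex function f(x) = x^2 + e^x.
def f1(x):   # f'(x)
    return 2.0 * x + math.exp(x)

def f2(x):   # f''(x) = 2 + e^x > 0 everywhere, so the step is well defined
    return 2.0 + math.exp(x)

x = 1.0
for _ in range(10):
    x -= f1(x) / f2(x)   # Newton step on the gradient

print(x, f1(x))  # stationary point: f'(x) ~ 0 after a few iterations
```

The fast shrinkage of f'(x) across iterations is the quadratic convergence rate analyzed in the lecture.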
6.252 NONLINEAR PROGRAMMING
LECTURE 5: RATE OF CONVERGENCE
LECTURE OUTLINE
Approaches for Rate of Convergence Analysis
The Local Analysis Method
Quadratic Model Analysis
The Role of the Condition Number
Scaling
Diagonal Scaling
Extension to Nonquadratic Problems
6.252 NONLINEAR PROGRAMMING
LECTURE 4
CONVERGENCE ANALYSIS OF GRADIENT METHODS
LECTURE OUTLINE
Gradient Methods - Choice of Stepsize
Gradient Methods - Convergence Issues
CHOICES OF STEPSIZE I
Minimization Rule: α^k is such that
f(x^k + α^k d^k) = min_{α≥0} f(x^k + α d^k).
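For a quadratic cost the minimization rule has a closed form, which makes it easy to demonstrate; the instance below is my own toy example:

```python
import numpy as np

# For f(x) = x'Qx/2 - b'x, minimizing f(x_k + a*d_k) over a gives
#     a_k = -(grad' d) / (d' Q d),
# which for steepest descent (d = -grad) becomes (d'd)/(d'Qd).
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(x):
    return Q @ x - b

x = np.array([5.0, -3.0])
for _ in range(50):
    d = -grad(x)                      # steepest descent direction
    if d @ d < 1e-16:
        break
    alpha = (d @ d) / (d @ Q @ d)     # exact line minimization
    x = x + alpha * d

print(x, np.linalg.norm(grad(x)))     # x ~ Q^{-1} b = (0.2, 0.4), grad ~ 0
```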
Problem Set 10 Solutions
May 14, 2005
5.5.2
The problem is
minimize 10x1 + 3x2
subject to 5x1 + x2 ≥ 4,
x1, x2 ∈ {0, 1}.
In Exercise 5.1.2, we found that the dual optimal value is q* = 8. Now, consider the constraint-relaxed problem
minimize 10x1 + 3x2
subject to
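A brute-force check of the primal problem (taking the garbled constraint as 5x1 + x2 ≥ 4, an assumption on my part; under that reading the primal optimum exceeds the stated dual value, i.e., there is a duality gap, as expected for an integer-constrained problem):

```python
from itertools import product

# Enumerate the four points of {0,1}^2 and keep the feasible ones.
best = min(
    (10 * x1 + 3 * x2, (x1, x2))
    for x1, x2 in product([0, 1], repeat=2)
    if 5 * x1 + x2 >= 4
)
print(best)  # -> (10, (1, 0)): f* = 10 > q* = 8, a duality gap of 2
```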
Problem Set 6 Solutions
March 31, 2005
3.2.1
(a) First consider the problem
min x1 + x2
subject to x1² + x2² = 2.
Note that ∇h(x) = 2x ≠ 0 for all feasible x. Thus any feasible x is regular, and we can apply
the Lagrange Multiplier Theorem. We have
L(x, λ) = x1 + x2 + λ(x1² + x2² − 2),
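The excerpt breaks off at the Lagrangian; as a check of my own, stationarity gives 1 + 2λx_i = 0, so x1 = x2 = −1/(2λ), and the constraint then forces λ = ±1/2, with the minimum at x = (−1, −1), λ = 1/2:

```python
# Verify stationarity and feasibility of the candidate x = (-1, -1),
# lam = 1/2 for: min x1 + x2  subject to  x1^2 + x2^2 = 2.
lam = 0.5
x1 = x2 = -1.0 / (2.0 * lam)

assert abs(1.0 + 2.0 * lam * x1) < 1e-12   # dL/dx1 = 0
assert abs(1.0 + 2.0 * lam * x2) < 1e-12   # dL/dx2 = 0
assert abs(x1**2 + x2**2 - 2.0) < 1e-12    # constraint holds
print(x1, x2, x1 + x2)  # -> -1.0 -1.0 -2.0 (the minimal objective value)
```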
Problem Set 5 Solutions
March 29, 2005
2.2.1
We have
∇f(x) = ( x1, x2, 0.1x3 + 0.55 )′
and
∇²f(x) = diag(1, 1, 0.1).
Since the Hessian is positive definite for all x ∈ X, f(x) is convex over the set X. Thus
satisfying the first order necessary condition is sufficient for optimality.
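A generic numerical check of the argument (the matrix entries below are my own example): a constant diagonal Hessian with positive entries is positive definite, which is what makes f convex over the convex set X.

```python
import numpy as np

# A symmetric matrix is positive definite iff all eigenvalues are > 0.
H = np.diag([1.0, 1.0, 0.1])       # example diagonal Hessian
eigs = np.linalg.eigvalsh(H)       # eigenvalues of a symmetric matrix
print(eigs)                        # all positive -> positive definite
assert np.all(eigs > 0)
```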
Problem Set 8 Solutions
April 26, 2005
4.2.1
We consider the problem
minimize f(x) = (1/2)(x1² − x2²) − 3x2
subject to x2 = 0.
(a) We have
L(x, λ) = (1/2)(x1² − x2²) − 3x2 + λx2,
so
∇_x L(x, λ) = ( x1, −x2 − 3 + λ ) = ( 0, 0 ),
∇_λ L(x, λ) = x2 = 0.
The only candidate for optimality is x* = (0, 0), with multiplier λ* = 3.
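A quick numerical confirmation of my own for this candidate (problem data as read from the excerpt):

```python
# For minimize (1/2)*(x1**2 - x2**2) - 3*x2 subject to x2 = 0, the
# stationarity conditions are x1 = 0 and -x2 - 3 + lam = 0, plus x2 = 0.
x1, x2, lam = 0.0, 0.0, 3.0

assert x1 == 0.0                        # dL/dx1 = x1 = 0
assert abs(-x2 - 3.0 + lam) < 1e-12     # dL/dx2 = -x2 - 3 + lam = 0
assert x2 == 0.0                        # constraint x2 = 0
print("candidate:", (x1, x2), "multiplier:", lam)
```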
Problem Set 3 Solutions
March 3, 2005
1.3.4
Without loss of generality we assume that x* = 0, so the iteration is written as
x^{k+1} = x^k − s(Qx^k + e^k) = (I − sQ)x^k − se^k.
Thus, we have
‖x^{k+1}‖ ≤ ‖I − sQ‖ ‖x^k‖ + s‖e^k‖ ≤ q‖x^k‖ + sδ,
where q = ‖I − sQ‖ and δ is the bound on the errors e^k. Applying this inequality sequentially, we obtain
‖x^k‖ ≤ q^k ‖x^0‖ + sδ(1 + q + · · · + q^{k−1}).
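The bound can be illustrated numerically; the matrix, stepsize, and error bound below are my own construction:

```python
import numpy as np

# Iterate x_{k+1} = (I - sQ)x_k - s*e_k with ||e_k|| <= delta, and check
# ||x_k|| <= q^k ||x_0|| + s*delta*(1 + q + ... + q^{k-1}), q = ||I - sQ||.
rng = np.random.default_rng(0)
Q = np.diag([1.0, 2.0, 4.0])
s, delta = 0.2, 0.1
q = np.linalg.norm(np.eye(3) - s * Q, 2)   # spectral norm: max|1 - s*lam| = 0.8

x = np.array([5.0, -5.0, 5.0])
x0_norm = np.linalg.norm(x)
geom = 0.0                                  # running value of s*delta*(1+q+...)
for k in range(200):
    e = rng.standard_normal(3)
    e *= delta / np.linalg.norm(e)          # error of norm exactly delta
    x = (np.eye(3) - s * Q) @ x - s * e
    geom = q * geom + s * delta             # geometric sum, updated recursively
    assert np.linalg.norm(x) <= q ** (k + 1) * x0_norm + geom + 1e-12

print(np.linalg.norm(x))  # settles below the limit s*delta/(1 - q) = 0.1
```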
Problem Set 4 Solutions
March 9, 2005
2.1.7
Consider the transformation of variables y = Q^{1/2}x and let w = Q^{1/2}z be the image of
the given vector z under this transformation. The problem is equivalent to the problem of
projecting w on the closed convex set obtained as the image of the constraint set under the same transformation.
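A small sketch of my own of this change of variables (the diagonal Q and the nonnegative-orthant constraint set are illustrative assumptions): projecting z in the norm ‖v‖_Q = √(v′Qv) is equivalent to a Euclidean projection of w = Q^{1/2}z on the transformed set, and for a diagonal Q the orthant maps to itself, so both routes agree.

```python
import numpy as np

Q = np.diag([4.0, 9.0])
Q_half = np.diag([2.0, 3.0])        # Q^{1/2} for this diagonal Q
z = np.array([1.5, -2.0])

# direct route: minimize (x-z)'Q(x-z) over x >= 0; separable for diagonal Q
x_direct = np.maximum(z, 0.0)

# transformed route: Euclidean projection of w on the orthant, mapped back
w = Q_half @ z
x_transformed = np.linalg.solve(Q_half, np.maximum(w, 0.0))

print(x_direct, x_transformed)  # both equal [1.5 0. ]
```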
Problem Set 9 Solutions
April 25, 2005
5.2.1
The given problem can be rewritten as
minimize Σ_{i=0}^{m} f_i(x_i)
subject to x_i ∈ X_i,   i = 0, 1, . . . , m,
x_i = x_0,   i = 1, 2, . . . , m.
The dual function for this problem is given by
q(λ_1, . . . , λ_m) = min_{x_i ∈ X_i, i=0,...,m} { Σ_{i=0}^{m} f_i(x_i) + Σ_{i=1}^{m} λ_i′(x_i − x_0) }.
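Once the coupling constraints x_i = x_0 are relaxed, the minimization splits across the x_i; the toy instance below is my own (scalar variables, X_i = [0, 1], linear costs f_i(x_i) = c_i·x_i):

```python
# q(lam) = min_{x_0 in [0,1]} (c_0 - sum(lam)) * x_0
#          + sum_i min_{x_i in [0,1]} (c_i + lam_i) * x_i,
# and each scalar linear min over [0, 1] is attained at an endpoint.
def q(lam, c):
    val = min(0.0, c[0] - sum(lam))        # x_0 term: pick endpoint 0 or 1
    for i, l in enumerate(lam, start=1):
        val += min(0.0, c[i] + l)          # x_i term: pick endpoint 0 or 1
    return val

c = [1.0, -2.0, 3.0]        # c_0, c_1, c_2
print(q([2.0, -3.0], c))    # -> 0.0
print(q([0.0, 0.0], c))     # -> -2.0
```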
Problem Set 7 Solutions
April 5, 2005
3.1.9
(b) Let the line passing through the points a and b be the set of all x satisfying the equation
d′x = c,
where d is a vector which can be taken to have unit norm. The projection of a vector x on
this line (see E
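The projection this setup leads to has a simple closed form, sketched here on data of my own: for the line {x : d′x = c} with ‖d‖ = 1, the Euclidean projection of x is x − (d′x − c)d, i.e., remove the residual's component along the normal d.

```python
import numpy as np

def project_on_line(x, d, c):
    d = d / np.linalg.norm(d)          # ensure d has unit norm
    return x - (d @ x - c) * d         # subtract the normal component

d = np.array([3.0, 4.0]) / 5.0         # unit normal
c = 2.0
p = project_on_line(np.array([5.0, 5.0]), d, c)
print(p, d @ p)  # -> [2. 1.] 2.0: p satisfies d'p = c, so it lies on the line
```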
Problem Set 2 Solutions
February 20, 2005
1.1.6
(a) The cost function is convex, so the necessary and sufficient condition for optimality of
x* is
Σ_{i=1}^{m} w_i (x* − y_i) / ‖x* − y_i‖ = 0,
which is the same as the condition for the equilibrium of forces in the Varignon frame
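The force-balance condition above suggests the classical Weiszfeld iteration for this weighted-distance problem; the anchor points and weights below are my own choice:

```python
import numpy as np

y = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])   # anchor points y_i
w = np.array([1.0, 1.0, 1.0])                        # weights w_i

x = y.mean(axis=0)                                   # start at the centroid
for _ in range(200):
    dist = np.linalg.norm(y - x, axis=1)
    if np.any(dist < 1e-12):                         # landed on an anchor point
        break
    coef = w / dist
    x = (coef[:, None] * y).sum(axis=0) / coef.sum() # Weiszfeld update

# residual = sum_i w_i (x - y_i)/||x - y_i||; ~0 at the optimum
residual = ((w / np.linalg.norm(y - x, axis=1))[:, None] * (x - y)).sum(axis=0)
print(x, np.linalg.norm(residual))
```

The vanishing residual is exactly the equilibrium-of-forces condition stated above.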
Problem Set 1 Solutions
February 9, 2005
1.1.1
We have
∇f(x, y) = ( 2x + βy + 1, βx + 2y + 2 ),
where β is the coefficient of the cross term xy. Setting ∇f(x, y) = 0, we obtain the system of equations
[ 2  β ] [ x ]   =   [ −1 ]
[ β  2 ] [ y ]       [ −2 ].
This system has a unique solution (a unique stationary point) except when β² = 4. If β² = 4, that is β = 2 or β = −2, the two equations are inconsistent and f has no stationary point.
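A quick numerical check of the two cases (beta below is a stand-in for the problem's cross-term coefficient):

```python
import numpy as np

# The stationary-point system [[2, beta], [beta, 2]] [x, y]' = [-1, -2]'
# is uniquely solvable iff the determinant 4 - beta^2 is nonzero.
def stationary_point(beta):
    A = np.array([[2.0, beta], [beta, 2.0]])
    if abs(np.linalg.det(A)) < 1e-12:
        return None                   # beta = +/- 2: singular, no unique solution
    return np.linalg.solve(A, np.array([-1.0, -2.0]))

print(stationary_point(0.0))   # -> [-0.5 -1. ]: a unique stationary point
print(stationary_point(2.0))   # -> None: determinant 4 - beta^2 vanishes
```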