6.252 NONLINEAR PROGRAMMING
LECTURE 9: FEASIBLE DIRECTION METHODS
LECTURE OUTLINE
Conditional Gradient Method
Gradient Projection Methods
A feasible direction at an x ∈ X is a vector d ≠ 0 such that x + αd ∈ X for all sufficiently small α > 0.
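As a concrete illustration of the first outline item, here is a minimal sketch of the conditional gradient (Frank-Wolfe) iteration on a small box-constrained quadratic. The objective, the box [0, 1]², and the classical step size 2/(k+2) are illustrative assumptions, not taken from the slides.

```python
import numpy as np

# Minimize f(x) = 0.5*x'Qx - b'x over the box [0, 1]^2 (illustrative choice).
Q = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])

def grad(x):
    return Q @ x - b

x = np.array([1.0, 1.0])           # feasible starting point
for k in range(200):
    g = grad(x)
    # Linear subproblem over the box: pick each coordinate's bound by gradient sign.
    x_bar = np.where(g > 0, 0.0, 1.0)
    d = x_bar - x                  # feasible direction: x + alpha*d stays in the box
    alpha = 2.0 / (k + 2)          # classical diminishing step size
    x = x + alpha * d

# The unconstrained minimizer Q^{-1} b = (0.5, 1.0) lies inside the box,
# so the iterates should approach it.
print(x)
```

Each iteration solves a linear program over the feasible set (trivial for a box) and moves along the resulting feasible direction, which is the defining pattern of feasible direction methods.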
6.252 NONLINEAR PROGRAMMING
LECTURE 8
OPTIMIZATION OVER A CONVEX SET;
OPTIMALITY CONDITIONS
Problem: min_{x∈X} f(x), where:
(a) X ⊂ ℝⁿ is nonempty, convex, and closed.
(b) f is continuously differentiable
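For reference, the basic first-order condition developed in this lecture can be stated as follows: if x* is a local minimum of f over X, then

```latex
\nabla f(x^*)'(x - x^*) \ge 0 \quad \text{for all } x \in X,
```

and when f is in addition convex over X, the same condition is also sufficient for x* to be a global minimum over X.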
6.252 NONLINEAR PROGRAMMING
LECTURE 2
UNCONSTRAINED OPTIMIZATION OPTIMALITY CONDITIONS
LECTURE OUTLINE
Unconstrained Optimization
Local Minima
Necessary Conditions for Local Minima
Sufficient Conditions for Local Minima
LECTURE SLIDES ON NONLINEAR PROGRAMMING
BASED ON LECTURES GIVEN AT THE
MASSACHUSETTS INSTITUTE OF TECHNOLOGY
CAMBRIDGE, MASS
DIMITRI P. BERTSEKAS
These lecture slides are based on the book:
Nonlinear Programming
6.252 NONLINEAR PROGRAMMING
LECTURE 6
NEWTON AND GAUSS-NEWTON METHODS
LECTURE OUTLINE
Newton's Method
Convergence Rate of the Pure Form
Global Convergence
Variants of Newton's Method
Least Squares
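The pure form of the method in the outline above can be sketched in a few lines for a one-dimensional minimization. The test function f(x) = x² + eˣ and the starting point are illustrative assumptions; the point is the iteration x ← x − f′(x)/f″(x) and its fast local convergence.

```python
import math

# Pure Newton iteration x_{k+1} = x_k - f'(x_k)/f''(x_k) for minimizing
# f(x) = x**2 + exp(x), a smooth, strictly convex illustrative function.
def fp(x):  return 2*x + math.exp(x)    # f'
def fpp(x): return 2 + math.exp(x)      # f'' > 0 everywhere

x = 1.0
for _ in range(8):
    x -= fp(x) / fpp(x)

# x* solves 2x + exp(x) = 0 (a root near -0.35); after a handful of steps
# |f'(x)| is essentially at machine precision, reflecting quadratic convergence.
print(x, fp(x))
```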
6.252 NONLINEAR PROGRAMMING
LECTURE 5: RATE OF CONVERGENCE
LECTURE OUTLINE
Approaches for Rate of Convergence Analysis
The Local Analysis Method
Quadratic Model Analysis
The Role of the Condition Number
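The role of the condition number can be seen numerically with steepest descent on a quadratic. For f(x) = ½x′Qx with exact line search, the cost contracts per step by at most ((κ−1)/(κ+1))², where κ = cond(Q). The matrix, the worst-case-style starting point, and the iteration count below are illustrative choices.

```python
import numpy as np

# Steepest descent with exact line search on f(x) = 0.5*x'Qx.
# For quadratics, f(x_{k+1}) <= ((kappa-1)/(kappa+1))**2 * f(x_k),
# where kappa = cond(Q): ill-conditioning slows convergence dramatically.
def final_value(kappa, iters=30):
    Q = np.diag([1.0, kappa])
    x = np.array([kappa, 1.0])         # start that achieves the worst-case rate
    for _ in range(iters):
        g = Q @ x
        alpha = (g @ g) / (g @ Q @ g)  # exact minimizing step for a quadratic
        x = x - alpha * g
    return 0.5 * x @ Q @ x

print(final_value(2.0))     # well-conditioned: essentially zero after 30 steps
print(final_value(100.0))   # ill-conditioned: still large after the same 30 steps
```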
Problem Set 10 Solutions
May 14, 2005
5.5.2
The problem is
minimize 10x₁ + 3x₂
subject to 5x₁ + x₂ ≥ 4,
x₁, x₂ ∈ {0, 1}.
In Exercise 5.1.2, we found that the dual optimal value is q* = 8. Now, consider
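Assuming the constraint reads 5x₁ + x₂ ≥ 4 (the reading consistent with a dual value of 8), both the primal optimal value and the dual optimal value of this tiny integer program can be brute-forced directly, exhibiting the duality gap:

```python
import itertools

# Brute-force check of the duality gap for the integer program
#   minimize 10*x1 + 3*x2   s.t.  5*x1 + x2 >= 4,  x1, x2 in {0, 1}
# (constraint direction assumed; it reproduces the dual value q* = 8).
def f(x):  return 10*x[0] + 3*x[1]
def g(x):  return 4 - 5*x[0] - x[1]          # g(x) <= 0 means feasible

points = list(itertools.product([0, 1], repeat=2))
f_star = min(f(x) for x in points if g(x) <= 0)

def q(mu):                                    # dual function q(mu) = min_x f + mu*g
    return min(f(x) + mu * g(x) for x in points)

q_star = max(q(mu / 100) for mu in range(0, 1001))   # grid over mu in [0, 10]

print(f_star, q_star)   # 10 and 8.0: a duality gap of 2
```

The gap arises because the feasible set {0, 1}² is nonconvex; weak duality (q* ≤ f*) still holds, as the printout confirms.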
Problem Set 6 Solutions
March 31, 2005
3.2.1
(a) First consider the problem
min x₁ + x₂
subject to x₁² + x₂² = 2.
Note that ∇h(x) = 2x ≠ 0 for all feasible x. Thus any feasible x is regular, and we c
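Reading the problem as minimizing x₁ + x₂ over the circle x₁² + x₂² = 2, the Lagrange multiplier conditions can be solved by hand and verified numerically. The candidate below comes from that hand calculation:

```python
# Lagrange conditions for  minimize x1 + x2  s.t.  h(x) = x1^2 + x2^2 - 2 = 0:
#   (1, 1) + lam * (2*x1, 2*x2) = 0   =>   x1 = x2 = -1/(2*lam),
# and h(x) = 0 then gives 2/(4*lam^2) = 2, so lam = 1/2 (the minimum) or lam = -1/2.
lam = 0.5
x = (-1/(2*lam), -1/(2*lam))           # candidate minimizer (-1, -1)

h = x[0]**2 + x[1]**2 - 2              # constraint residual
grad_L = (1 + 2*lam*x[0], 1 + 2*lam*x[1])
print(x, h, grad_L)                    # (-1, -1), 0, (0, 0)
```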
Problem Set 5 Solutions
March 29, 2005
2.2.1
We have
∇f(x) = (x₁, x₂, 0.1x₃ + 0.55)′
and
∇²f(x) = diag(1, 1, 0.1).
Since the Hessian is positive definite for all x ∈ X, f(x) is convex over the set X. Thus
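A quick numerical confirmation of the positive definiteness argument, assuming the Hessian is the constant diagonal matrix diag(1, 1, 0.1): a symmetric matrix is positive definite exactly when all its eigenvalues are positive.

```python
import numpy as np

# A constant positive definite Hessian makes f (strictly) convex:
# check by confirming all eigenvalues are positive.
H = np.diag([1.0, 1.0, 0.1])       # Hessian assumed for illustration
eigvals = np.linalg.eigvalsh(H)    # eigenvalues of a symmetric matrix, ascending
print(eigvals, bool((eigvals > 0).all()))
```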
Problem Set 8 Solutions
April 26, 2005
4.2.1
We consider the problem
minimize f(x) = ½(x₁² − x₂²) − 3x₂
subject to x₂ = 0.
(a) We have
L(x, λ) = ½(x₁² − x₂²) − 3x₂ + λx₂,
so
∇ₓL(x, λ) = (x₁, −x₂ − 3 + λ)′.
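Assuming the problem reads f(x) = ½(x₁² − x₂²) − 3x₂ with constraint x₂ = 0, the stationarity conditions of the Lagrangian pin down the candidate pair in a couple of lines:

```python
# Stationarity of the Lagrangian for the (assumed) problem
#   f(x) = 0.5*(x1**2 - x2**2) - 3*x2,  subject to x2 = 0:
#   dL/dx1 = x1 = 0,    dL/dx2 = -x2 - 3 + lam = 0.
# Together with the constraint x2 = 0 this forces x* = (0, 0) and lam* = 3.
x1, x2 = 0.0, 0.0
lam = 3.0
dL = (x1, -x2 - 3 + lam)    # gradient of L in x at the candidate pair
print(dL)                   # (0.0, 0.0)
```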
Problem Set 3 Solutions
March 3, 2005
1.3.4
Without loss of generality we assume that x* = 0, so the iteration is written as
xₖ₊₁ = xₖ − s(Qxₖ + eₖ) = (I − sQ)xₖ − seₖ.
Thus, we have
‖xₖ₊₁‖ ≤ ‖(I − sQ)xₖ‖ + s‖eₖ‖ ≤ q
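A numerical sanity check of the contraction argument: if q = ‖I − sQ‖ < 1 and the errors satisfy ‖eₖ‖ ≤ ε, the recursion ‖xₖ₊₁‖ ≤ q‖xₖ‖ + sε keeps ‖xₖ‖ eventually below sε/(1 − q). The matrix, step size, and error level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Iterate x_{k+1} = (I - s*Q)*x_k - s*e_k with bounded errors ||e_k|| <= eps.
Q = np.diag([1.0, 2.0])
s = 0.4
eps = 0.1
q = max(abs(1 - s * 1.0), abs(1 - s * 2.0))    # norm of the diagonal I - s*Q

x = np.array([10.0, -10.0])
for _ in range(500):
    e = rng.uniform(-1, 1, size=2)
    e *= eps / max(np.linalg.norm(e), 1e-12)   # scale so that ||e_k|| <= eps
    x = (np.eye(2) - s * Q) @ x - s * e

bound = s * eps / (1 - q)                       # asymptotic bound s*eps/(1-q)
print(np.linalg.norm(x), bound)
```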
Problem Set 4 Solutions
March 9, 2005
2.1.7
Consider the transformation of variables y = Q^{1/2}x and let w = Q^{1/2}z be the image of
the given vector z under this transformation. The problem is equivalent
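The effect of this scaling can be illustrated numerically: under y = Q^{1/2}x, the quadratic ½x′Qx − b′x becomes ½y′y − (Q^{−1/2}b)′y, i.e., a perfectly conditioned quadratic in y. The particular Q and b below are illustrative.

```python
import numpy as np

# Change of variables y = Q^{1/2} x for a positive definite quadratic.
Q = np.array([[4.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])

# Q^{1/2} via the eigendecomposition of the symmetric matrix Q.
w, V = np.linalg.eigh(Q)
Q_half = V @ np.diag(np.sqrt(w)) @ V.T

def f(x):
    return 0.5 * x @ Q @ x - b @ x

def g(y):
    # The transformed objective, evaluated via x = Q^{-1/2} y.
    x = np.linalg.solve(Q_half, y)
    return f(x)

x = np.array([0.3, -0.7])
y = Q_half @ x
print(abs(f(x) - g(y)))   # ~0: same function value, new coordinates
```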
Problem Set 9 Solutions
April 25, 2005
5.2.1
The given problem can be rewritten as
minimize Σᵢ₌₀ᵐ fᵢ(xᵢ)
subject to xᵢ ∈ Xᵢ, i = 0, 1, . . . , m,
xᵢ = x₀, i = 1, 2, . . . , m.
The dual function for this
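The separable structure is the point of this reformulation: dualizing the coupling constraints xᵢ = x₀ splits the Lagrangian minimization into independent pieces, one per variable. A tiny scalar instance with quadratic fᵢ and m = 2 (all data assumed for illustration):

```python
# Dual decomposition for the (assumed) scalar instance
#   minimize 0.5*(x0-c0)^2 + 0.5*(x1-c1)^2 + 0.5*(x2-c2)^2
#   subject to x1 = x0, x2 = x0,
# with multipliers l1, l2 on the coupling constraints. Each inner minimization
# below is one-dimensional: min_x 0.5*(x-c)^2 + a*x has value a*c - 0.5*a^2.
c0, c1, c2 = 0.0, 1.0, 2.0

def q(l1, l2):
    a = l1 + l2
    q0 = c0*a - 0.5*a**2        # min over x0 of f0(x0) + (l1+l2)*x0
    q1 = -c1*l1 - 0.5*l1**2     # min over x1 of f1(x1) - l1*x1
    q2 = -c2*l2 - 0.5*l2**2     # min over x2 of f2(x2) - l2*x2
    return q0 + q1 + q2

# Primal optimum: all variables equal the mean of (c0, c1, c2), so f* = 1.0.
# The problem is convex, so strong duality holds; q attains 1.0 at (l1, l2) = (0, -1).
print(q(0.0, -1.0))
```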
Problem Set 7 Solutions
April 5, 2005
3.1.9
(b) Let the line passing through the points a and b be the set of all x satisfying the equation
d′x = c,
where d is a vector which can be taken to have unit norm
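In the plane, such a d and c are easy to construct from the two points: take d orthogonal to b − a and normalize, then set c = d′a. The specific points below are arbitrary.

```python
import numpy as np

# Represent the line through points a and b as { x : d'x = c } with ||d|| = 1.
a = np.array([1.0, 2.0])
b = np.array([4.0, 3.0])

t = b - a
d = np.array([-t[1], t[0]])     # rotate b - a by 90 degrees: normal to the line
d = d / np.linalg.norm(d)       # unit norm
c = d @ a

print(d @ a - c, d @ b - c)     # both ~0: a and b satisfy d'x = c
```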
Problem Set 2 Solutions
February 20, 2005
1.1.6
(a) The cost function is convex, so the necessary and sufficient condition for optimality of x* is
Σᵢ₌₁ᵐ wᵢ (x* − yᵢ) / ‖x* − yᵢ‖ = 0,
which is the same as the condition
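This is the optimality condition for the weighted sum-of-distances (Fermat point) problem, and a Weiszfeld-style fixed-point iteration drives it to zero: repeatedly average the yᵢ with weights wᵢ/‖x − yᵢ‖. The points and weights below are illustrative, and this sketch is not taken from the solution itself.

```python
import numpy as np

# Weiszfeld-style iteration for the weighted Fermat point: the fixed point
# satisfies  sum_i w_i * (x - y_i) / ||x - y_i|| = 0.
y = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.0]])
w = np.array([1.0, 1.0, 1.0])

x = y.mean(axis=0)                        # start at the centroid
for _ in range(200):
    dist = np.linalg.norm(y - x, axis=1)
    coef = w / np.maximum(dist, 1e-12)    # guard against landing exactly on a y_i
    x = (coef[:, None] * y).sum(axis=0) / coef.sum()

# Residual of the optimality condition at the final iterate.
residual = ((w / np.linalg.norm(y - x, axis=1))[:, None] * (x - y)).sum(axis=0)
print(x, residual)                        # residual ~ (0, 0) at the optimum
```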
Problem Set 1 Solutions
February 9, 2005
1.1.1
We have
∇f(x, y) = (2x + y + 1, 2y + x + 2)′.
Setting ∇f(x, y) = 0, we obtain the system of equations
[ 2  1 ] [ x ]   [ −1 ]
[ 1  2 ] [ y ] = [ −2 ].
This system has a unique solution (a u
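Solving this 2×2 system numerically confirms the unique stationary point:

```python
import numpy as np

# Setting the gradient (2x + y + 1, x + 2y + 2) to zero gives the linear system
#   [[2, 1], [1, 2]] (x, y)' = (-1, -2)'.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
rhs = np.array([-1.0, -2.0])

sol = np.linalg.solve(A, rhs)
print(sol)   # [ 0. -1.]: the unique stationary point is (x, y) = (0, -1)
```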