22M:270 Optimization Techniques
Homework 1 due Friday Jan 27
1. Find all the critical points of the following function
f(x, y) = x^4 - x^2 y + 2y^2 - 2y.
Determine whether they are local minima, local maxima, or saddle points by examining the Hessian matrix at each critical point.
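The classification can be checked symbolically; a minimal sketch with sympy (my illustration, not part of the handout):

```python
import sympy as sp

x, y = sp.symbols("x y", real=True)
f = x**4 - x**2 * y + 2 * y**2 - 2 * y

# Critical points: solve grad f = 0
grad = [sp.diff(f, v) for v in (x, y)]
crit = sp.solve(grad, [x, y], dict=True)

# Second-derivative test via the Hessian (nondegenerate points assumed)
H = sp.hessian(f, (x, y))
for pt in crit:
    Hp = H.subs(pt)
    det, fxx = Hp.det(), Hp[0, 0]
    kind = ("saddle" if det < 0
            else "local min" if fxx > 0
            else "local max")
    print(pt, kind)
```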

22M:270 Optimization Techniques
Homework 1 answers
1. Find all the critical points of the following function
f(x, y) = x^4 - x^2 y + 2y^2 - 2y.
Determine whether they are local minima, local maxima, or saddle points by examining
the Hessian matrix at each critical point.

22M:174 Optimization Techniques
Homework 2 due Friday Feb 3
1. Show using Taylor series that a function f : R^n → R is convex if and only if
the Hessian matrix ∇²f(x) is positive semi-definite for all x.
2. Implement a simple steepest descent algorithm with x
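The assignment text is cut off here. A minimal steepest descent sketch with Armijo backtracking (my illustration, not the handout's template; the test function and starting point are assumptions):

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=10_000):
    """Steepest descent with Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = -g                      # steepest descent direction
        t, c1 = 1.0, 1e-4
        while f(x + t * p) > f(x) + c1 * t * g @ p:
            t *= 0.5                # backtrack until sufficient decrease
        x = x + t * p
    return x

# Example: the quartic from Homework 1
f = lambda v: v[0]**4 - v[0]**2 * v[1] + 2 * v[1]**2 - 2 * v[1]
grad = lambda v: np.array([4 * v[0]**3 - 2 * v[0] * v[1],
                           -v[0]**2 + 4 * v[1] - 2])
x_star = steepest_descent(f, grad, [1.0, 0.5])
```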

Bracketing phase
Given: α_1 and α_max,
α_0 ← 0
for i = 1, 2, . . .
    evaluate φ(α_i)
    if φ(α_i) > φ(0) + c_1 α_i φ'(0) or
       (i > 1 and φ(α_i) ≥ φ(α_{i-1})) then
        return zoom(α_{i-1}, α_i)
    evaluate φ'(α_i)
    if |φ'(α_i)| ≤ c_2 |φ'(0)| then
        return α_i
    if φ'(α_i) ≥ 0 then
        return zoom(α_i, α_{i-1})
    choose α_{i+1} > α_i
end for
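A runnable version of the bracketing phase might look like the sketch below. The zoom routine is referenced but not defined in this excerpt, so a simple bisection-style zoom is assumed here (interpolation would be used in practice):

```python
def wolfe_line_search(phi, dphi, c1=1e-4, c2=0.9,
                      alpha1=1.0, alpha_max=50.0, max_iter=30):
    """Bracketing phase of a strong-Wolfe line search.

    phi(a) = f(x + a*p), dphi(a) = phi'(a), with dphi(0) < 0.
    """
    phi0, dphi0 = phi(0.0), dphi(0.0)

    def zoom(lo, hi):
        # Bisection zoom over an interval known to contain a Wolfe point.
        for _ in range(50):
            a = 0.5 * (lo + hi)
            if phi(a) > phi0 + c1 * a * dphi0 or phi(a) >= phi(lo):
                hi = a
            else:
                if abs(dphi(a)) <= c2 * abs(dphi0):
                    return a
                if dphi(a) * (hi - lo) >= 0:
                    hi = lo
                lo = a
        return a

    a_prev, a = 0.0, alpha1
    for i in range(1, max_iter + 1):
        if phi(a) > phi0 + c1 * a * dphi0 or (i > 1 and phi(a) >= phi(a_prev)):
            return zoom(a_prev, a)
        if abs(dphi(a)) <= c2 * abs(dphi0):
            return a
        if dphi(a) >= 0:
            return zoom(a, a_prev)
        a_prev, a = a, min(2 * a, alpha_max)  # expand the bracket
    return a
```

For example, on φ(α) = (1 − α)² the search accepts α = 1 immediately, since the sufficient decrease and curvature conditions both hold there.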

Notes on the linear conjugate gradient method
1. The conjugacy property
Minimizing the convex quadratic function
    f(x) = (1/2) x^T A x - b^T x + c
is equivalent to solving the linear system
    ∇f(x) = Ax - b = 0.
For positive definite A this can be done using a method called the conjugate gradient method.
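For reference, linear CG for a symmetric positive definite A can be sketched as follows (the standard algorithm; variable names are mine):

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Linear conjugate gradient for Ax = b, A symmetric positive definite."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x          # residual; note r = -grad f(x)
    p = r.copy()           # first search direction is steepest descent
    rs = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs / (p @ Ap)        # exact minimizer along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p    # keeps directions A-conjugate
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic CG terminates in at most n iterations, which the n-by-n example above illustrates.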

22M:174 Optimization Techniques
Homework 4 due Monday Mar 19
1. Suppose that A is a symmetric matrix. This problem is about finding a
number τ that guarantees that A + τI is positive definite. The idea is based
on an estimate for eigenvalues called the Gershgorin circle theorem.
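The Gershgorin circle theorem places every eigenvalue λ of A in a disc centered at a_ii with radius R_i = Σ_{j≠i} |a_ij|, so λ ≥ min_i (a_ii − R_i). A hedged numerical illustration of the shift the problem has in mind (the symbol τ and the margin are my choices, not the handout's):

```python
import numpy as np

def gershgorin_shift(A, margin=1e-8):
    """A shift tau such that A + tau*I is positive definite, based on
    the Gershgorin lower bound min_i (a_ii - R_i) on the eigenvalues."""
    R = np.sum(np.abs(A), axis=1) - np.abs(np.diag(A))  # off-diagonal row sums
    lower = np.min(np.diag(A) - R)   # every eigenvalue of A is >= lower
    return max(0.0, -lower) + margin

A = np.array([[1.0, 2.0, 0.0],
              [2.0, -3.0, 1.0],
              [0.0, 1.0, 2.0]])
tau = gershgorin_shift(A)
```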

22M:174 Optimization Techniques
Homework 3 due Friday Feb 24
1. Implement the standard Newton method for solving ∇f(x) = 0 without line
search. Test it on the function f(x, y) = x^4 - x^2 y + 2y^2 - 2y with the starting
points (x, y) = (0.01, 0.1), (1, 1), and
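A minimal sketch of the iteration (my illustration, not the assignment's template): each step solves ∇²f(x) s = −∇f(x) and sets x ← x + s, with no globalization, so the method happily converges to saddle points as well as minima.

```python
import numpy as np

def newton(grad, hess, x0, tol=1e-10, max_iter=50):
    """Pure Newton's method for grad f(x) = 0, no line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x + np.linalg.solve(hess(x), -g)   # Newton step
    return x

# f(x, y) = x^4 - x^2 y + 2 y^2 - 2 y from the problem statement
grad = lambda v: np.array([4 * v[0]**3 - 2 * v[0] * v[1],
                           -v[0]**2 + 4 * v[1] - 2])
hess = lambda v: np.array([[12 * v[0]**2 - 2 * v[1], -2 * v[0]],
                           [-2 * v[0], 4.0]])

x1 = newton(grad, hess, [1.0, 1.0])    # approaches a local minimum
x2 = newton(grad, hess, [0.01, 0.1])   # approaches the saddle (0, 1/2)
```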

22M:174 Optimization Techniques
Homework 2 answers
1. Show using Taylor series that a function f : R^n → R is convex if and only if the
Hessian matrix ∇²f(x) is positive semi-definite for all x.
Using Taylor series with integral second-order remainder,
    f(y) = f(x) + ∇f(x)^T (y - x) + ∫_0^1 (1 - t) (y - x)^T ∇²f(x + t(y - x)) (y - x) dt.
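The answer is cut off in this copy; a sketch (my reconstruction) of how the rest of the argument goes from the integral remainder form:

```latex
% Forward direction: if \nabla^2 f \succeq 0 everywhere, the integrand
% (1-t)\,(y-x)^T \nabla^2 f(x + t(y-x))\,(y-x) is nonnegative, so
f(y) \ge f(x) + \nabla f(x)^T (y - x) \quad \text{for all } x, y,
% which is the first-order characterization of convexity.
% Converse: if v^T \nabla^2 f(x) v < 0 for some unit vector v, take
% y = x + s v; by continuity the remainder is negative for small s > 0, so
f(x + s v) < f(x) + s\, \nabla f(x)^T v,
% contradicting the inequality above. Hence \nabla^2 f \succeq 0 everywhere.
```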

Global optimization
Calculus-based methods are necessarily local methods: they do not guarantee finding the global minimum unless the function belongs to a restricted class (convex, for example). However, users typically want to find the global
minimum. Doing t

22M:174/22C:174
Optimization Techniques
Spring 2012
This course will cover theory and (computational) practice for dealing with
optimization problems with continuous (rather than discrete) variables, with and
without constraints. These problems have immense