15.084J Recitation Handout 10
Tenth Week in a Nutshell:
Importance of duality
Lagrangian dual approach
Features of the Dual
Column-geometry dual approach
Weak Duality
Strong Duality
Importance of Duality
In many problems, dual variables have useful interpretations…
15.084J Recitation Handout 6
Sixth Week in a Nutshell:
Penalty/Barrier Methods
Quiz Review
Penalty Methods
How should we solve constrained optimization problems, given what we know of the unconstrained case?
Add a penalty to the infeasible region, and the…
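The penalty idea above can be sketched in a few lines. The one-dimensional objective, constraint, penalty schedule, and step-size rule below are illustrative assumptions, not taken from the handout.

```python
# Quadratic-penalty sketch (illustrative): minimize f(x) = (x - 2)^2
# subject to g(x) = x - 1 <= 0.  The true constrained optimum is x* = 1.
# We minimize f(x) + c * max(0, g(x))^2 for an increasing sequence of c.

def solve_penalized(c, x, iters=1000):
    step = 0.5 / (1.0 + c)  # safe step for this problem's curvature
    for _ in range(iters):
        # gradient of (x - 2)^2 + c * max(0, x - 1)^2
        x -= step * (2.0 * (x - 2.0) + 2.0 * c * max(0.0, x - 1.0))
    return x

x = 0.0
for c in [1.0, 10.0, 100.0, 1000.0]:  # increasing penalty parameter
    x = solve_penalized(c, x)         # warm-start each subproblem
print(round(x, 2))  # → 1.0 (the unconstrained minimizers approach x* = 1)
```

Each subproblem's minimizer is (2 + c)/(1 + c), which slides toward the constrained optimum as the penalty parameter grows.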
15.084J Recitation Handout 5
Fifth Week in a Nutshell:
When is KKT Necessary
Sufficient Conditions
Steepest Descent for Constrained Problems
New Definitions
x is a Slater point for a set of constraints if x is strictly feasible for all inequalities and feasible for all equalities.
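As a concrete illustration of the definition, the helper below (hypothetical, not from the handout) tests strict feasibility of the inequalities and feasibility of the equalities:

```python
# Illustrative check: x is a Slater point if every inequality g_i(x) <= 0
# holds strictly and every equality h_i(x) = 0 holds (to tolerance).
def is_slater_point(x, ineqs, eqs=(), tol=1e-9):
    strict = all(g(x) < -tol for g in ineqs)      # strictly feasible
    feas_eq = all(abs(h(x)) <= tol for h in eqs)  # feasible for equalities
    return strict and feas_eq

# Example constraints (invented): x0 + x1 <= 2 and x0 >= 0, written as g(x) <= 0.
ineqs = [lambda x: x[0] + x[1] - 2.0, lambda x: -x[0]]
print(is_slater_point([0.5, 0.5], ineqs))  # True: both inequalities strict
print(is_slater_point([0.0, 1.0], ineqs))  # False: -x0 = 0, not strict
```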
15.084J Recitation Handout 4
Fourth Week in a Nutshell:
Separating Hyperplanes
Theorem of the Alternative (Farkas Lemma)
Necessary Conditions for Optimum of Constrained Problem
Finding Optima
Separating Hyperplanes
Main point: Two closed, convex, disjoint sets…
15.084J Recitation Handout 3
Third Week in a Nutshell:
Method of Steepest Descent
Why this method is good
Why this method is bad
Line Search Algorithm
Method of Steepest Descent
In order to find a minimum of f(x), take a point x_k, find the direction of steepest descent…
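A minimal sketch of the method; the quadratic test function and the Armijo-style backtracking parameters below are illustrative assumptions, not the handout's exact algorithm.

```python
# Steepest descent with backtracking line search on an illustrative
# quadratic f(x, y) = x^2 + 4 y^2, whose minimum is 0 at the origin.
def f(p):
    x, y = p
    return x * x + 4.0 * y * y

def grad(p):
    x, y = p
    return [2.0 * x, 8.0 * y]

def steepest_descent(p, iters=100):
    for _ in range(iters):
        g = grad(p)
        d = [-gi for gi in g]  # direction of steepest descent: -grad f
        t = 1.0
        # backtrack until an Armijo sufficient-decrease condition holds
        while f([p[0] + t * d[0], p[1] + t * d[1]]) > f(p) - 0.5 * t * (g[0] ** 2 + g[1] ** 2):
            t *= 0.5
        p = [p[0] + t * d[0], p[1] + t * d[1]]
    return p

p = steepest_descent([2.0, 1.0])
print(f(p) < 1e-4)  # True: the objective is driven near its minimum, 0
```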
15.084J Recitation Handout 2
Second Week in a Nutshell:
Newton's Method
When Newton's Method Fails
Rates of Convergence
Quadratic Forms
Eigenvectors/Eigenvalues/Decompositions
New notation: "Q is positive semidefinite" is written Q ⪰ 0 (and similarly Q ≻ 0 for positive definite).
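For intuition, Q ⪰ 0 is equivalent to all eigenvalues of Q being nonnegative. The small 2×2 checker below is an illustration, not part of the handout.

```python
import math

# For a symmetric 2x2 matrix Q = [[a, b], [b, c]], Q ⪰ 0 iff both
# eigenvalues are >= 0 (and Q ≻ 0 iff both are > 0).
def eigenvalues_2x2(a, b, c):
    tr, det = a + c, a * c - b * b
    root = math.sqrt(max(0.0, tr * tr - 4.0 * det))
    return (tr - root) / 2.0, (tr + root) / 2.0

def is_psd(a, b, c, tol=1e-12):
    lo, _ = eigenvalues_2x2(a, b, c)
    return lo >= -tol

print(is_psd(2.0, 1.0, 2.0))  # True:  eigenvalues are 1 and 3
print(is_psd(1.0, 2.0, 1.0))  # False: eigenvalues are -1 and 3
```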
15.084J Recitation Handout 1
First Week in a Nutshell:
The Basic Problem
Basic Definitions
Weierstrass Theorems
Necessary and Sufficient Conditions for Optimality
The Basic Problem:
min f(x) subject to x ∈ S, g(x) ≤ 0, h(x) = 0
Variations can be put into this form…
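As a small illustration of such a reformulation (the functions below are invented for the example), a maximization with a ≥-constraint can be negated into the min/≤ form above:

```python
# "maximize x0 + x1 s.t. x0*x1 >= 1" recast into "min f(x) s.t. g(x) <= 0"
# by negating the objective and the constraint.
f = lambda x: -(x[0] + x[1])        # maximize x0 + x1  ->  minimize -(x0 + x1)
g = lambda x: -(x[0] * x[1] - 1.0)  # x0*x1 >= 1        ->  1 - x0*x1 <= 0

x = [2.0, 3.0]
print(f(x) <= f([1.0, 1.0]))  # True: the recast objective prefers larger x0 + x1
print(g(x) <= 0.0)            # True: x is feasible in the recast form
```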
Introduction to Semidefinite Programming (SDP)
Robert M. Freund
March, 2004
© 2004 Massachusetts Institute of Technology.
1 Introduction
Semidefinite programming (SDP) is the most exciting development in mathematical programming in the 1990s. SDP has applic…
Subgradient Optimization, Generalized
Programming, and Nonconvex Duality
Robert M. Freund
May, 2004
1 Subgradient Optimization
1.1 Review of Subgradients
Recall the following facts about subgradients of convex functions…
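As a small illustration of how subgradients get used, the sketch below runs the subgradient method on a nondifferentiable function; the objective and diminishing step-size rule are illustrative, not from the notes.

```python
# Subgradient-method sketch (illustrative): minimize f(x) = |x - 3| using
# the subgradient g = sign(x - 3) and diminishing steps 1/(k+1).  Because
# the method is not a descent method, we track the best value seen so far.
def subgrad(x):
    return 1.0 if x > 3.0 else (-1.0 if x < 3.0 else 0.0)

x, best = 0.0, float("inf")
for k in range(2000):
    best = min(best, abs(x - 3.0))
    x -= subgrad(x) / (k + 1.0)  # steps sum to infinity but go to zero
print(round(best, 2))  # → 0.0: the best objective value approaches the minimum
```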
Additional Homework Problems
Robert M. Freund
April, 2004
1 Exercises
1. Let ℝ^n_+ denote the nonnegative orthant, namely ℝ^n_+ = {x ∈ ℝ^n | x_j ≥ 0, j = 1, . . . , n}. Considering ℝ^n_+ as a cone, prove that…
Duality Theory of Constrained Optimization
Robert M. Freund
March, 2004
1 Overview
The Practical Importance of Duality
Definition of the Dual Problem
Steps in the Construction of the Dual Problem
Examples…
Analysis of Convex Sets and Functions
Robert M. Freund
April, 2004
1 Convex Sets - Basics
A set S ⊆ ℝ^n is defined to be a convex set if for any x1 ∈ S, x2 ∈ S, and any scalar λ satisfying 0 ≤ λ ≤ 1, λx1 + (1 − λ)x2 ∈ S. Point…
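The definition can be checked numerically on an example; the unit disc below is an illustrative convex set, not one taken from the notes.

```python
# Numeric illustration of the definition: for the disc S = {x : ||x|| <= 1},
# which is convex, every convex combination of two members stays in S.
def in_disc(p):
    return p[0] ** 2 + p[1] ** 2 <= 1.0 + 1e-12

x1, x2 = (1.0, 0.0), (-0.6, 0.8)  # both points lie in the unit disc
ok = all(
    in_disc(((1 - t) * x1[0] + t * x2[0], (1 - t) * x1[1] + t * x2[1]))
    for t in [i / 100.0 for i in range(101)]  # sample t in [0, 1]
)
print(ok)  # True: the whole segment between x1 and x2 lies in the disc
```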
Primal-Dual Interior-Point Methods for Linear
Programming
based on Newton's Method
Robert M. Freund
March, 2004
1 The Problem
The logarithmic barrier approach to solving a linear program dates back to the work…
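A one-dimensional sketch of the barrier idea (an illustrative toy problem, not the primal-dual method developed in these notes): replacing the constraint by a logarithmic barrier term yields a family of minimizers, the central path, that approaches the constrained optimum as the barrier parameter shrinks.

```python
# Barrier-idea sketch (illustrative): minimize c*x s.t. x >= 0, with c > 0,
# so the constrained optimum is x* = 0.  The barrier problem
#     minimize  c*x - mu*ln(x)
# has first-order condition c - mu/x = 0, hence the closed-form minimizer
# x(mu) = mu/c: the central path, which tends to x* = 0 as mu -> 0.
c = 2.0
path = [mu / c for mu in [1.0, 0.1, 0.01, 0.001]]
print(path)  # [0.5, 0.05, 0.005, 0.0005] -> approaches x* = 0
```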
Conditional Gradient Method, plus Subgradient
Optimization
Robert M. Freund
March, 2004
1 The Conditional-Gradient Method for Constrained Optimization (Frank-Wolfe Method)
We now consider the following optimization problem…
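The method can be sketched on a toy problem. The objective, the box feasible region, and the classical step size 2/(k+2) below are assumptions for illustration: at each iteration a linear function (the gradient) is minimized over the feasible set, and the iterate moves toward that vertex.

```python
# Frank-Wolfe sketch (illustrative): minimize
#     f(x) = (x0 - 0.8)^2 + (x1 - 2)^2   over the box [0, 1]^2,
# whose constrained optimum is x* = (0.8, 1.0).
def grad(x):
    return [2.0 * (x[0] - 0.8), 2.0 * (x[1] - 2.0)]

x = [0.0, 0.0]
for k in range(5000):
    g = grad(x)
    # linear minimization oracle over the box: pick the vertex minimizing g·s
    s = [0.0 if gi > 0.0 else 1.0 for gi in g]
    gamma = 2.0 / (k + 2.0)                # classical diminishing step size
    x = [(1 - gamma) * xi + gamma * si for xi, si in zip(x, s)]
print([round(v, 2) for v in x])  # → [0.8, 1.0]
```

Note that every iterate is a convex combination of points in the box, so feasibility is maintained without any projection.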
Penalty and Barrier Methods
for Constrained Optimization
Robert M. Freund
February, 2004
1 Introduction
Consider the constrained optimization problem P:

P :  minimize_x  f(x)
     s.t.        g_i(x) ≤ 0,  i = 1, . . . , m
                 h_i…
Projection Methods for Linear Equality
Constrained Problems
Robert M. Freund
March, 2004
1 Review of Steepest Descent
Suppose we want to solve

P :  minimize_x  f(x)
     s.t.        x ∈ ℝ^n,

where f(x) is differentiable. At the p…
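For the linear-equality-constrained setting these notes turn to, a minimal sketch (with an invented two-variable example) projects the steepest-descent direction onto the constraint's null space so that every iterate stays feasible:

```python
# Projected steepest-descent sketch (illustrative): minimize x0^2 + x1^2
# s.t. x0 + x1 = 2, whose constrained minimum is (1, 1).  The gradient is
# projected onto the null space of a^T x = b before each step.
a, b = [1.0, 1.0], 2.0
x = [2.0, 0.0]  # feasible starting point: a·x = 2
for _ in range(200):
    g = [2.0 * x[0], 2.0 * x[1]]
    # component of g along a, removed so the step keeps a·x constant
    coef = (a[0] * g[0] + a[1] * g[1]) / (a[0] ** 2 + a[1] ** 2)
    d = [-(g[0] - coef * a[0]), -(g[1] - coef * a[1])]
    x = [x[0] + 0.25 * d[0], x[1] + 0.25 * d[1]]  # fixed step size
print([round(v, 3) for v in x])  # → [1.0, 1.0], the constrained minimum
```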