Distributed Recovery/Regression/Classification
using ADMM
By being very crafty with how we do the splitting, we can use ADMM
to solve certain kinds of optimization programs in a distributed manner.
We consider (this material comes from [BPC+10, Sec. 8]) t
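To give a feel for the kind of splitting involved, here is a consensus-ADMM sketch for distributed averaging. The problem instance (each agent holding a private scalar a_i) and all variable names are illustrative assumptions, not taken from these notes.

```python
import numpy as np

# Consensus ADMM on a toy problem: each of n agents holds a private
# scalar a_i, and together they solve
#     minimize sum_i (1/2)(x_i - a_i)^2   subject to x_i = z for all i,
# whose solution is the average of the a_i.  (Hypothetical instance.)
def consensus_admm(a, rho=1.0, iters=100):
    a = np.asarray(a, dtype=float)
    x = np.zeros_like(a)   # local copies
    z = 0.0                # global consensus variable
    u = np.zeros_like(a)   # scaled dual variables
    for _ in range(iters):
        x = (a + rho * (z - u)) / (1.0 + rho)  # local proximal steps
        z = np.mean(x + u)                     # gather/average step
        u = u + x - z                          # scaled dual ascent
    return z

print(consensus_admm([1.0, 2.0, 6.0]))  # approaches the average, 3.0
```

Each x-update touches only that agent's private data, so the work parallelizes; only the averaging step requires communication.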
ECE 8823 (Convex Optimization), Spring 2017
Homework #1
Due Monday January 23, in class
Reading: B&V, Chapter 1. You might also want to skim Appendix A.
Please sign up for Piazza at https://piazza.com/gatech/spring2017/ece8823b.
1. Using your class notes, p
Unconstrained minimization of smooth functions
We will start our discussion about solving convex optimization programs by considering the unconstrained case. Our template problem
is
minimize_{x ∈ R^N}  f(x),        (1)
where f is convex. While we state this problem a
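For concreteness, a minimal gradient-descent sketch for the template (1), using an assumed smooth convex quadratic f(x) = (1/2) xᵀAx − bᵀx as a stand-in objective (its minimizer solves Ax = b):

```python
import numpy as np

# Gradient descent sketch for the template problem (1), with the
# assumed smooth convex objective f(x) = (1/2) x^T A x - b^T x,
# whose minimizer solves A x = b.
def gradient_descent(A, b, step, iters=500):
    x = np.zeros_like(b)
    for _ in range(iters):
        grad = A @ x - b        # gradient of f at the current iterate
        x = x - step * grad     # fixed-step descent update
    return x

A = np.array([[2.0, 0.0], [0.0, 4.0]])
b = np.array([2.0, 4.0])
print(gradient_descent(A, b, step=0.2))  # converges to [1, 1]
```

The fixed step 0.2 is below 2 divided by the largest eigenvalue of A (here 4), which is what makes the iteration contract.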
A first look at duality
It is time for our first look at one of the most important concepts in
convex optimization (and all of convex analysis in general): duality.
With the separating and supporting hyperplane theorems, we have
had a glimpse of how hyper
Convex functions
The domain dom f of a functional f : R^N → R is the subset of
R^N where f is well-defined.
A function(al) f is convex if dom f is a convex set, and
f(θx + (1 − θ)y) ≤ θf(x) + (1 − θ)f(y)
for all x, y ∈ dom f and 0 ≤ θ ≤ 1.
f is concave if −f is convex.
f is
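The defining inequality is easy to spot-check numerically; here is a sketch using the (assumed, illustrative) convex example f(x) = ‖x‖₂²:

```python
import numpy as np

# Numerical spot-check of the convexity inequality
#     f(theta x + (1 - theta) y) <= theta f(x) + (1 - theta) f(y)
# for the convex example f(x) = ||x||_2^2 (an illustrative choice).
f = lambda v: float(np.dot(v, v))

rng = np.random.default_rng(0)
ok = True
for _ in range(1000):
    x, y = rng.standard_normal(3), rng.standard_normal(3)
    theta = rng.uniform()
    lhs = f(theta * x + (1 - theta) * y)
    rhs = theta * f(x) + (1 - theta) * f(y)
    ok = ok and lhs <= rhs + 1e-12
print(ok)  # True: the inequality holds at every sampled pair
```

A check like this can never prove convexity, but it is a cheap way to falsify a guess.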
A second look at duality
Believe it or not, we now know enough to explore one of the key
concepts in convex optimization: duality. In our first look at duality,
we saw how we could recast a particular problem, find the closest
point in a convex set to a f
ECE 8823 (Convex Optimization), Spring 2015
Homework #2
Due Monday February 6, in class
Reading: Boyd and Vandenberghe, Chapters 2 and 3
1. Using your class notes, prepare a 1-2 paragraph summary of what we talked about
in class in the last week. I do not
ECE 8823 (Convex Optimization), Spring 2017
Grading Rubric
1. Each homework is graded on a scale of 100 points.
2. Each problem, including the written paragraph, is worth an equal number of points.
If there are N problems, then each problem is worth floor
ECE 8823 (Convex Optimization), Spring 2017
Homework #6
Due Wednesday April 5, in class
Reading:
B&V Chapters 4, 6 and 7;
Boyd et al., Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers, 2011
Along with problem 1, the assignment consists of 5 questions chosen
ECE 8823 CVXOPT: Final project poster guidelines
The final project posters will be presented on Friday April 28. There will be two poster sessions: the
first runs from 3:00p until 4:15p, the second runs from 4:30p until 5:45p. Projects will be assigned
to
ECE 8823 (Convex Optimization), Spring 2017
Course Project
The course project is worth 50% of your final grade, and will involve an in-depth investigation of
a topic of your choice. The project can either involve advanced study of a theoretical concept of
ECE 8823 CVXOPT: Final project report guidelines
The final report for the project should be submitted by 11:59pm, Monday, May 1 (together
with the pdf copy of your poster). The final report is worth 40% of your final project
grade.
The report and the pos
ECE 8823 (Convex Optimization), Spring 2017
Homework #4
Due Wednesday March 1, in class
Reading: B&V, Chapter 9
1. Using your class notes, prepare a 1-2 paragraph summary of what we talked about
in class in the last week. I do not want just a bulleted list
ECE 8823 (Convex Optimization), Spring 2017
Homework #5
Due Friday March 17, in class
Reading: B&V Chapter 5
1. Using your class notes, prepare a 1-2 paragraph summary of what we talked about
in class since the last homework. I do not want just a bulleted
Convex sets
In this section, we will be introduced to some of the mathematical
fundamentals of convex sets. In order to motivate some of the definitions, we will look at the closest point problem from several different
angles. The tools and concepts we de
A second look at duality
In our first look at duality, we saw how we could recast the closest
point problem as a maximization problem over the set of separating
hyperplanes. Key to this was the support functional, which basically
gave us a way to transfor
Convergence of Newton's Method
Suppose that f (x) is strongly convex,
mI ⪯ ∇²f(x) ⪯ MI,   for all x ∈ R^N,
and that its Hessian is Lipschitz,
‖∇²f(x) − ∇²f(y)‖ ≤ L‖x − y‖₂.
(The norm on the left-hand side above is the standard operator
norm.) We will show that the Newton algori
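Before the formal argument, the fast convergence is easy to see numerically. A sketch of plain Newton iterations on the assumed one-dimensional strongly convex example f(x) = eˣ + e⁻ˣ, whose minimizer is x* = 0:

```python
import math

# Newton's method in one dimension on the strongly convex example
# f(x) = exp(x) + exp(-x)  (an assumed toy function; minimizer x* = 0).
def newton(x0, iters=10):
    x = x0
    for _ in range(iters):
        g = math.exp(x) - math.exp(-x)   # f'(x)
        h = math.exp(x) + math.exp(-x)   # f''(x), always >= 2
        x = x - g / h                    # Newton step
    return x

print(newton(1.0))  # collapses to 0 in a handful of iterations
```

The error roughly cubes here at each step, the quadratic-convergence behavior the analysis below quantifies.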
Convex relaxation
The art and science of convex relaxation revolves around taking a
non-convex problem that you want to solve, and replacing it with
a convex problem which you can actually solve; the solution to
the convex program gives information about (
Quasi-Newton Methods
A great resource for the material in this section is [NW06, Chapter
6].
Newton's method is great in that it converges to tremendous accuracy in a surprisingly small number of iterations, especially for
smooth functions. It is not
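A minimal sketch of the BFGS flavor of quasi-Newton method (cf. [NW06, Ch. 6]): an inverse-Hessian estimate is updated from gradient differences, so no second derivatives are ever formed. The quadratic test objective and the simple Armijo backtracking line search are illustrative assumptions.

```python
import numpy as np

# BFGS sketch: keep an inverse-Hessian estimate H and update it from
# gradient differences -- no second derivatives needed.  The quadratic
# objective and Armijo backtracking below are illustrative choices.
def bfgs(f, grad, x0, iters=100):
    n = x0.size
    H = np.eye(n)                       # inverse-Hessian approximation
    x = x0.astype(float)
    g = grad(x)
    for _ in range(iters):
        p = -H @ g                      # quasi-Newton direction
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5                    # Armijo backtracking
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy <= 1e-12:                 # converged (or unsafe update)
            return x_new
        rho = 1.0 / sy
        I = np.eye(n)
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)      # BFGS inverse update
        x, g = x_new, g_new
    return x

A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, 2.0])
x_star = bfgs(lambda v: 0.5 * v @ A @ v - b @ v,
              lambda v: A @ v - b, np.zeros(2))
print(x_star)  # approaches the solution of A x = b, which is [0, 2]
```

The guard sy > 0 is what keeps H positive definite, so every p is a descent direction.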
Lagrange duality
Another way to arrive at the KKT conditions, and one which gives
us some insight on solving constrained optimization problems, is
through the Lagrange dual. The dual is a maximization program in
the dual variables; it is always concave (even when the origi
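As a small worked instance (chosen here for illustration, not taken from these notes), the Lagrange dual of the equality-constrained least-norm problem comes out in closed form:

```latex
\begin{align*}
\text{(primal)}\quad & \underset{x}{\text{minimize}}\ \tfrac{1}{2}\|x\|_2^2
   \quad\text{subject to}\quad Ax = b \\
\text{(Lagrangian)}\quad & L(x,\nu) = \tfrac{1}{2}\|x\|_2^2
   + \nu^{\mathsf{T}}(Ax - b) \\
\text{(minimize over } x\text{)}\quad & \nabla_x L = x + A^{\mathsf{T}}\nu = 0
   \ \Rightarrow\ x = -A^{\mathsf{T}}\nu \\
\text{(dual function)}\quad & g(\nu)
   = -\tfrac{1}{2}\nu^{\mathsf{T}} A A^{\mathsf{T}} \nu - b^{\mathsf{T}}\nu
\end{align*}
```

Since AAᵀ is positive semidefinite, g is concave in ν regardless of the primal data, illustrating the general fact above.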
The Karush-Kuhn-Tucker (KKT) conditions
In this section, we will give a set of sufficient (and in most cases necessary) conditions for a point x⋆ to be the solution of a given convex optimization problem. These are called the Karush-Kuhn-Tucker (KKT)
conditions
ℓ1 minimization
We will now focus on underdetermined systems of equations:
[Figure: an underdetermined data acquisition system, in which the number of samples is smaller than the resolution/bandwidth of the unknown signal/image.]
Suppose we observe y = Φx0, and given y we attempt to estimate
x0 by applying the pseudo-inverse of Φ to
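A quick numerical sketch of this estimator on an assumed random instance (the 10 × 30 Gaussian Phi and the particular sparse x0 below are illustrative, not from these notes):

```python
import numpy as np

# Underdetermined sketch: observe y = Phi @ x0 with far fewer samples
# than unknowns, then form the pseudo-inverse estimate.  The 10 x 30
# Gaussian Phi and the sparse x0 are assumed for illustration.
rng = np.random.default_rng(1)
Phi = rng.standard_normal((10, 30))    # 10 measurements, 30 unknowns
x0 = np.zeros(30)
x0[[2, 11, 25]] = [1.0, -2.0, 3.0]     # sparse ground truth
y = Phi @ x0

x_pinv = np.linalg.pinv(Phi) @ y       # minimum-l2-norm solution

print(np.allclose(Phi @ x_pinv, y))    # True: measurements match ...
print(np.count_nonzero(np.abs(x_pinv) > 1e-8))  # ... but no sparsity
```

The estimate is consistent with the data yet spreads energy across every coordinate, which is exactly the failure mode that motivates ℓ1 minimization.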
Existence of minimizers
We have just talked a lot about how to find the minimizer of an
unconstrained convex optimization problem. We have not talked
too much, at least not in concrete mathematical terms, about the
conditions under which functionals achie
Algorithms for constrained optimization
There are as many constrained optimization algorithms as there are
pages in the SIAM Journal on Optimization. But many can be categorized into a small number of basic frameworks.
Eliminating equality constraints
Progra
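A sketch of the elimination idea on an assumed random instance: parameterize the feasible set of Ax = b as x = x_p + Fz, where x_p is any particular solution and the columns of F span null(A), turning the constrained problem into an unconstrained one in z.

```python
import numpy as np

# Elimination sketch: every solution of A x = b can be written as
# x = x_p + F z, with x_p a particular solution and the columns of F a
# basis for null(A); minimizing over z is then unconstrained.  The
# random 2 x 5 instance below is an illustrative assumption.
rng = np.random.default_rng(3)
A = rng.standard_normal((2, 5))
b = rng.standard_normal(2)

x_p = np.linalg.lstsq(A, b, rcond=None)[0]   # a particular solution
_, _, Vt = np.linalg.svd(A)
F = Vt[2:].T                                 # null-space basis (5 x 3)

z = rng.standard_normal(3)                   # any z at all ...
x = x_p + F @ z
print(np.allclose(A @ x, b))                 # ... stays feasible: True
```

Here the null-space basis comes from the SVD: the rows of Vt beyond rank(A) are orthonormal and annihilated by A.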
Alternating direction method of multipliers
(ADMM)
Again, this material is mostly pulled from [BPC+10]. I have uploaded
the paper to T-square so you can download it if you are interested.
ADMM extends the method of multipliers in such a way that we
get bac
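To make the method concrete, here is a sketch of ADMM applied to the lasso with the splitting x = z; the problem instance (a random 20 × 10 A, the sparse x0, and λ = 0.1) is an illustrative assumption, not an example from [BPC+10].

```python
import numpy as np

# ADMM sketch for the lasso:
#   minimize (1/2)||Ax - b||^2 + lam * ||z||_1   subject to  x = z.
# The random instance below is assumed for illustration.
def soft_threshold(v, k):
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def lasso_admm(A, b, lam, rho=1.0, iters=300):
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    M = A.T @ A + rho * np.eye(n)     # x-update system matrix (fixed)
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(M, Atb + rho * (z - u))   # ridge-type step
        z = soft_threshold(x + u, lam / rho)          # prox of l1 term
        u = u + x - z                                 # dual update
    return z

rng = np.random.default_rng(2)
A = rng.standard_normal((20, 10))
x0 = np.zeros(10)
x0[[1, 6]] = [2.0, -3.0]              # sparse ground truth
b = A @ x0
z = lasso_admm(A, b, lam=0.1)
print(np.flatnonzero(np.abs(z) > 0.1))   # support should include 1 and 6
```

Note the splitting assigns the smooth quadratic to the x-update and the nonsmooth ℓ1 term to the z-update, so each step is cheap and in closed form.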
Introduction to Optimization
In its most general form, an optimization program
min_{x}  f0(x)   subject to  x ∈ X
searches for the vector x ∈ R^N that minimizes a given functional
f0 : R^N → R over a set X ⊆ R^N.
We will rely on X to be specified by a series of constrain
Max flow, Min cut
Consider the following network.
[Figure: a network of routers connected by capacitated communication links.]
The nodes are routers and the edges are communication links; associated with each link is a capacity: node 1 can communicate to node 2 at up to 4 Mbps, node 2 can communicat
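The maximum flow itself can be computed with breadth-first augmenting paths (Edmonds-Karp). The small network below is made up for illustration; only the capacity-4 link from node 1 to node 2 echoes the text.

```python
from collections import deque, defaultdict

# Max-flow sketch via Edmonds-Karp (BFS augmenting paths on the
# residual graph).  The little network below is hypothetical.
def max_flow(cap, s, t):
    flow = 0
    while True:
        parent = {s: None}                 # BFS for a shortest
        q = deque([s])                     # augmenting path
        while q and t not in parent:
            u = q.popleft()
            for v, c in cap[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:                # no path left: done
            return flow
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(cap[a][b] for a, b in path)   # bottleneck capacity
        for a, b in path:
            cap[a][b] -= push                    # forward residual
            cap[b][a] = cap[b].get(a, 0) + push  # reverse residual
        flow += push

cap = defaultdict(dict)
for u, v, c in [(1, 2, 4), (1, 3, 3), (2, 4, 2), (3, 4, 5), (2, 3, 1)]:
    cap[u][v] = c
flow_value = max_flow(cap, 1, 4)
print(flow_value)  # 6 for this network (min cut {1, 2})
```

Here the value 6 equals the capacity of the cut separating nodes {1, 2} from {3, 4}, exactly the max-flow/min-cut equality the section is about.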
ECE 8823 CVXOPT: Final project proposal guidelines
The first main deliverable for the final project is the project proposal to be submitted in class
on Friday April 7. Each group must submit a hardcopy of a single proposal that contains the
following info