Distributed Recovery/Regression/Classification
using ADMM
By being very crafty with how we do the splitting, we can use ADMM
to solve certain kinds of optimization programs in a distributed manner.
We
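As a single-machine sketch of the splitting idea (the distributed variants build on the same three updates), here is ADMM applied to the lasso, minimizing 0.5‖Ax − b‖² + λ‖z‖₁ subject to x = z; the penalty ρ, λ, and iteration count below are my own choices, not from the notes:

```python
import numpy as np

def soft_threshold(v, k):
    # Proximal operator of k * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def lasso_admm(A, b, lam, rho=1.0, iters=200):
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    # Cache the Cholesky factor used by every x-update
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        # x-update: ridge-type linear solve
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: soft thresholding
        z = soft_threshold(x + u, lam / rho)
        # scaled dual update: accumulate the disagreement x - z
        u = u + x - z
    return z
```

The x-update solves a ridge-type system (whose factorization is computed once), the z-update is a soft threshold, and the scaled dual variable u tracks the running disagreement between the two copies of the variable.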
ECE 8823 (Convex Optimization), Spring 2017
Homework #1
Due Monday January 23, in class
Reading: B&V, Chapter 1. You might also want to skim Appendix A.
Please sign up for Piazza at https://piazza.com/
Unconstrained minimization of smooth functions
We will start our discussion about solving convex optimization programs by considering the unconstrained case. Our template problem
is
minimize_x  f(x)                    (1)
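To make the template concrete, here is a minimal fixed-step gradient descent sketch on a toy quadratic; the function, step size, and iteration count are my own choices, not from the notes:

```python
import numpy as np

def gradient_descent(grad, x0, step, iters=500):
    # Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Toy instance: f(x) = 0.5 x^T Q x - b^T x, whose minimizer solves Q x = b
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_hat = gradient_descent(lambda x: Q @ x - b, x0=[0.0, 0.0], step=0.1)
```

Here the step size 0.1 is safely below 2 divided by the largest eigenvalue of Q, which is what guarantees convergence for this quadratic.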
A first look at duality
It is time for our first look at one of the most important concepts in
convex optimization (and all of convex analysis in general): duality.
With the separating and supporting
Convex functions
The domain dom f of a functional f : R^N → R is the subset of R^N where f is well-defined.
A function(al) f is convex if dom f is a convex set, and

f(θx + (1 − θ)y) ≤ θ f(x) + (1 − θ) f(y)

for all x, y ∈ dom f and all θ ∈ [0, 1].
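The defining inequality can be sanity-checked numerically; a small sketch, with a function and pair of points of my own choosing:

```python
import numpy as np

def convexity_gap(f, x, y, theta):
    # theta*f(x) + (1 - theta)*f(y) - f(theta*x + (1 - theta)*y):
    # nonnegative for every theta in [0, 1] exactly when the defining
    # inequality holds for this pair of points
    return theta * f(x) + (1 - theta) * f(y) - f(theta * x + (1 - theta) * y)

f = lambda v: np.sum(v ** 2)          # ||v||^2 is convex
x, y = np.array([1.0, -2.0]), np.array([3.0, 0.5])
gaps = [convexity_gap(f, x, y, t) for t in np.linspace(0.0, 1.0, 101)]
```

For this quadratic the gap works out to θ(1 − θ)‖x − y‖², which is visibly nonnegative on [0, 1].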
A second look at duality
Believe it or not, we now know enough to explore one of the key
concepts in convex optimization: duality. In our first look at duality,
we saw how we could recast a particular
ECE 8823 (Convex Optimization), Spring 2015
Homework #2
Due Monday February 6, in class
Reading: Boyd and Vandenberghe, Chapters 2 and 3
1. Using your class notes, prepare a 1-2 paragraph summary of wh
ECE 8823 (Convex Optimization), Spring 2017
Grading Rubric
1. Each homework is graded on a scale of 100 points.
2. Each problem, including the written paragraph, is worth an equal number of points.
If
ECE 8823 (Convex Optimization), Spring 2017
Homework #6
Due Wednesday April 5, in class
Reading:
B&V Chapters 4, 6 and 7;
Boyd et al., Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers, 2011
Along
ECE 8823 CVXOPT: Final project poster guidelines
The final project posters will be presented on Friday April 28. There will be two poster sessions: the
first runs from 3:00p until 4:15p, the second ru
ECE 8823 (Convex Optimization), Spring 2017
Course Project
The course project is worth 50% of your final grade, and will involve an in-depth investigation of
a topic of your choice. The project can ei
ECE 8823 CVXOPT: Final project report guidelines
The final report for the project should be submitted by 11:59pm, Monday, May 1 (together
with the pdf copy of your poster). The final report is worth 4
ECE 8823 (Convex Optimization), Spring 2017
Homework #4
Due Wednesday March 1, in class
Reading: B&V, Chapter 9
1. Using your class notes, prepare a 1-2 paragraph summary of what we talked about
in cla
ECE 8823 (Convex Optimization), Spring 2017
Homework #5
Due Friday March 17, in class
Reading: B&V Chapter 5
1. Using your class notes, prepare a 1-2 paragraph summary of what we talked about
in class
Convex sets
In this section, we will be introduced to some of the mathematical
fundamentals of convex sets. In order to motivate some of the definitions, we will look at the closest point problem from
A second look at duality
In our first look at duality, we saw how we could recast the closest
point problem as a maximization problem over the set of separating
hyperplanes. Key to this was the suppor
Convergence of Newton's Method
Suppose that f(x) is strongly convex,

mI ⪯ ∇²f(x) ⪯ MI,   for all x ∈ R^N,

and that its Hessian is Lipschitz,

‖∇²f(x) − ∇²f(y)‖ ≤ L‖x − y‖₂.

(The norm on the left-hand side above is the stand
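Under these assumptions the analysis predicts quadratic convergence once the iterates are close. A sketch of the pure Newton iteration on a toy strongly convex function (the function and starting point are my own choices, not from the notes):

```python
import numpy as np

def newton(grad, hess, x0, iters=10):
    # Pure Newton iteration: x <- x - [Hessian(x)]^{-1} grad(x)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - np.linalg.solve(hess(x), grad(x))
    return x

# f(x) = sum_i (e^{x_i} + e^{-x_i}) is strongly convex (Hessian >= 2I),
# with minimizer x = 0
grad = lambda x: np.exp(x) - np.exp(-x)
hess = lambda x: np.diag(np.exp(x) + np.exp(-x))
x_hat = newton(grad, hess, x0=[1.0, -0.5])
```

The number of correct digits roughly doubles each step once the iterate is near the minimizer, which is why so few iterations suffice.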
Convex relaxation
The art and science of convex relaxation revolves around taking a
non-convex problem that you want to solve, and replacing it with
a convex problem which you can actually solve the s
Quasi-Newton Methods
A great resource for the material in this section is [NW06, Chapter
6].
Newton's method is great in that it converges to tremendous accuracy in a surprisingly small number of
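As a sketch of the idea, here is a bare-bones BFGS implementation with Armijo backtracking; the parameter choices are my own, and [NW06, Chapter 6] gives the full treatment:

```python
import numpy as np

def bfgs(f, grad, x0, iters=100, tol=1e-8):
    # Bare-bones BFGS: maintain H, an approximation to the *inverse* Hessian,
    # via rank-two updates, and globalize with Armijo backtracking.
    x = np.asarray(x0, dtype=float)
    H = np.eye(x.size)
    g = grad(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                      # quasi-Newton search direction
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5                    # Armijo backtracking line search
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                  # curvature condition keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(x.size)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

The key point is that no Hessian is ever formed or inverted: H is built up purely from gradient differences.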
Lagrange duality
Another way to arrive at the KKT conditions, and one which gives
us some insight on solving constrained optimization problems, is
through the Lagrange dual. The dual is a maximization
The Karush-Kuhn-Tucker (KKT) conditions
In this section, we will give a set of sufficient (and in most cases necessary) conditions for a point x⋆ to be the solution of a given convex optimization problem. T
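As a small worked instance (my own illustration, not from the notes): minimize $\tfrac{1}{2}\|x\|_2^2$ subject to $a^T x = 1$. Stationarity of the Lagrangian $L(x, \nu) = \tfrac{1}{2}\|x\|_2^2 + \nu\,(a^T x - 1)$ gives

$$\nabla_x L = x^\star + \nu^\star a = 0 \quad\Longrightarrow\quad x^\star = -\nu^\star a,$$

and primal feasibility $a^T x^\star = 1$ then forces $\nu^\star = -1/\|a\|_2^2$, so $x^\star = a/\|a\|_2^2$: the closest point to the origin on the hyperplane.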
ℓ1 minimization
We will now focus on underdetermined systems of equations:
[Figure: an underdetermined data acquisition system; the number of samples in the data is smaller than the resolution/bandwidth of the unknown signal/image.]
Suppose we observe y = Φx0, and given y we atte
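One standard formulation here is basis pursuit: minimize ‖x‖₁ subject to Ax = y. A minimal sketch solving it with ADMM; the splitting is standard, but the matrix name A, the penalty ρ, and the iteration count are my own choices:

```python
import numpy as np

def basis_pursuit_admm(A, y, rho=1.0, iters=1000):
    # min ||x||_1  s.t.  Ax = y, via the splitting x = z
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    AAt_inv = np.linalg.inv(A @ A.T)
    P = np.eye(n) - A.T @ AAt_inv @ A      # orthogonal projector onto null(A)
    q = A.T @ AAt_inv @ y                  # minimum-norm solution of Ax = y
    for _ in range(iters):
        x = P @ (z - u) + q                # project z - u onto {x : Ax = y}
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - 1.0 / rho, 0.0)  # soft threshold
        u = u + x - z
    return x
```

Each x-update is an exact projection onto the affine constraint set, so every iterate is feasible; the z-update shrinks the iterate toward sparsity.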
Existence of minimizers
We have just talked a lot about how to find the minimizer of an
unconstrained convex optimization problem. We have not talked
too much, at least not in concrete mathematical te
Algorithms for constrained optimization
There are as many constrained optimization algorithms as there are
pages in the SIAM Journal on Optimization. But many can be categorized into a small number of ba
Alternating direction method of multipliers
(ADMM)
Again, this material is mostly pulled from [BPC+10]. I have uploaded
the paper to T-square so you can download it if you are interested.
ADMM extends
Introduction to Optimization
In its most general form, an optimization program

minimize_x  f0(x)   subject to  x ∈ X

searches for the vector x ∈ R^N that minimizes a given functional f0 : R^N → R over a set X ⊆ R^N.
We
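As a tiny concrete instance of this template (the function and constraint set are my own illustration): take f0(x) = ‖x − a‖² and X the unit box, for which the minimizer has a closed form.

```python
import numpy as np

# Template instance: minimize f0(x) = ||x - a||^2 over X = [0, 1]^N.
# For this particular X the minimizer is simply a clipped to the box,
# coordinate by coordinate.
a = np.array([1.7, -0.3, 0.4])
x_star = np.clip(a, 0.0, 1.0)
```

Most constraint sets do not admit such a closed form, which is what the algorithms in these notes are for.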
Max flow, Min cut
Consider the following network.
[Figure: a seven-node network; the edges are labeled with their capacities.]
The nodes are routers, the edges are communication links; associated with each link is a capacity. Node 1 can communicat
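The maximum flow can be computed with the Edmonds-Karp algorithm (shortest augmenting paths); a sketch on a small four-node example of my own, not the network in the figure:

```python
from collections import deque

def max_flow(cap, s, t):
    # Edmonds-Karp: repeatedly push flow along shortest augmenting paths
    n = len(cap)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        # BFS in the residual graph
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            break                      # no augmenting path: flow is maximum
        # Find the bottleneck capacity along the path, then augment
        bottleneck = float('inf')
        v = t
        while v != s:
            u = parent[v]
            bottleneck = min(bottleneck, cap[u][v] - flow[u][v])
            v = u
        v = t
        while v != s:
            u = parent[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck   # residual (reverse) capacity
            v = u
        total += bottleneck
    return total

# Example network: node 0 = source, node 3 = sink
cap = [[0, 3, 2, 0],
       [0, 0, 1, 2],
       [0, 0, 0, 3],
       [0, 0, 0, 0]]
```

For this example the maximum flow is 5, which matches the capacity of the minimum cut separating the source from the sink, as the max-flow/min-cut theorem promises.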
ECE 8823 CVXOPT: Final project proposal guidelines
The first main deliverable for the final project is the project proposal to be submitted in class
on Friday April 7. Each group must submit a hardcop