Convex Optimization II - Lecture 07
Instructor (Stephen Boyd): Well, this is – we’re all supposed to pretend it’s Tuesday. I’ll
be in Washington, unfortunately. So today I’ll finish up and give a wrap-up on the
Analytic Center Cutting-Plane Method, and then we’ll move on to, actually, one of the
coolest topics that really kind of finishes this whole section of the class, and that’s the
Ellipsoid Method. So we’ll look at this, and I’ll try to make clear what is useful and
what’s not. The Analytic Center Cutting-Plane Method is useful. When you have a
problem that you need to solve where you really do only have access to something like a
cutting plane or sub-gradient oracle, and for whatever reason – you have to look at those
reasons very carefully.
But if you have such a problem, this is an awfully good choice, and it’s going to beat any
subgradient-type method by a wide margin. It does have much more computation,
obviously, per step, and more storage, all sorts of stuff like that, compared to a subgradient
method, which involves essentially zero computation and zero storage per step. But
counted in iterations, it’s way, way better than subgradient methods. Okay, so here’s the Analytic Center
Cutting-Plane Method. You’re given an initial polyhedron known to contain your target set,
which might be the set of feasible points, or the set of epsilon-suboptimal points.
Doesn’t really matter what it is. You find the analytic center of the polyhedron, which is
to say more precisely, you find the analytic center of the linear inequalities that represent
the polyhedron. And you query the cutting-plane oracle at that point. If that point is in
your target set, you quit. Otherwise, you append the new cutting plane to the
set of inequalities. Of course, what happens now is that your point x^(k+1) is not in the interior of P^(k+1), by definition.
Well, sorry – with a neutral cut it might be in P^(k+1), but then it’s on the boundary.
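The loop just described can be sketched as follows. This is only an illustration, not the lecture’s code; `oracle` and `analytic_center` are placeholder names for a cutting-plane oracle (returns `None` when its query point is in the target set, else a cut `(a, b)` meaning `a @ z <= b` for all target points) and an analytic center routine.

```python
import numpy as np

def accpm(C, d, oracle, analytic_center, max_iters=50):
    """Sketch of the analytic center cutting-plane method (ACCPM).

    C, d                   define the initial polyhedron {x : C @ x <= d}
    oracle(x)              None if x is in the target set, else a cutting
                           plane (a, b) with a @ z <= b for all targets z
    analytic_center(C, d)  analytic center of {x : C @ x <= d}
    """
    for k in range(max_iters):
        x = analytic_center(C, d)   # x^(k+1): center of current polyhedron
        cut = oracle(x)
        if cut is None:             # x is in the target set: done
            return x
        a, b = cut                  # append the new inequality a @ x <= b
        C = np.vstack([C, a])
        d = np.append(d, b)
    return x                        # best point found within max_iters
```

Note that after the `vstack`, the just-computed center violates (or sits on the boundary of) the enlarged inequality set, which is exactly why the next center computation cannot start from a strictly feasible point.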
What you need to do now is, at the next step, calculate the analytic center of
that new set, and I think – I won’t go through this. There are a lot of different methods.
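As one concrete sketch of such a method: below is a plain damped Newton iteration on the log barrier. It assumes a strictly feasible starting point, which is exactly what the infeasible start variant mentioned in the lecture does not require (the previous center violates the newly added cut); this feasible-start version is just the simplest thing to write down.

```python
import numpy as np

def analytic_center(C, d, x0, tol=1e-10, max_iters=50):
    """Damped Newton method for the analytic center of {x : C @ x <= d}.

    Sketch only: assumes x0 is strictly feasible (C @ x0 < d), i.e.
    minimizes phi(x) = -sum(log(d - C @ x)) from inside the polyhedron.
    """
    x = x0.astype(float)
    for _ in range(max_iters):
        s = d - C @ x                       # slacks, all positive
        g = C.T @ (1.0 / s)                 # gradient of phi
        H = C.T @ np.diag(1.0 / s**2) @ C   # Hessian of phi
        dx = np.linalg.solve(H, -g)         # Newton step
        lam2 = -g @ dx                      # Newton decrement squared
        if lam2 / 2 < tol:
            return x
        phi = -np.sum(np.log(s))
        t = 1.0
        while np.any(d - C @ (x + t * dx) <= 0):  # stay strictly feasible
            t *= 0.5
        while -np.sum(np.log(d - C @ (x + t * dx))) > phi - 0.25 * t * lam2:
            t *= 0.5                        # backtracking line search
        x = x + t * dx
    return x
```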
An infeasible start Newton method is the simplest one – maybe not the best – but we’ll go on and
just go to a problem, and I’ll show how this works. So this is a problem we’ve looked at
already several times. It’s piecewise linear minimization. It’s a problem with 20 variables
and 100 terms, and an optimal value around one. Let me add something, just to make sure this is
absolutely clear: if you just had to solve a problem like
this, it goes without saying that you would not use something like the analytic center cutting-plane method.
You’d just solve the LP. Let’s bear in mind that every one of these iterations requires an
effort that’s at least on the order of simply solving this problem, right off the bat, to ten
significant figures by a barrier method. So your code from last quarter –
your homework code, which shouldn’t have been too many lines – will actually solve this
problem very, very quickly. In fact, anyone want to take a cut at how fast it would be?
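To make the "just solve the LP" remark concrete: piecewise-linear minimization of max_i (a_i^T x + b_i) becomes an LP by introducing an epigraph variable t. The sketch below uses scipy's `linprog` as a stand-in LP solver and random data of the same shape (20 variables, 100 terms); it is not the lecture's instance, so the optimal value will not be the quoted "around one".

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 100, 20                 # 100 terms, 20 variables, as in the lecture
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# minimize max_i (a_i^T x + b_i)
#   <=>  minimize t  subject to  A @ x - t <= -b   (variables (x, t))
c = np.zeros(n + 1)
c[-1] = 1.0                                  # objective: minimize t
A_ub = np.hstack([A, -np.ones((m, 1))])      # A x - t <= -b
res = linprog(c, A_ub=A_ub, b_ub=-b,
              bounds=[(None, None)] * (n + 1))  # variables are free
x_star, p_star = res.x[:n], res.x[-1]
```

At the optimum, t equals the piecewise-linear objective at x_star, since the solver pushes t down onto the maximum of the affine terms.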