Convex Optimization II - Lecture 11

Instructor (Stephen Boyd): Hey, I guess we'll start. Let me just make a couple of announcements. I guess we've combined the rewrite of the preliminary project proposal with the mid-term project review, and I think that's due this Friday. Good, okay, it's due Friday. So that's going to be that. You're welcome to show your draft to me or the TAs if you want a format scan. The TAs are now as nasty as I am: they can scan something, just walk down it, and say, 'That's a typesetting error. I'm not going to read any further.' Or they'll read down a half-page and say, 'What are the variables?' and then throw it at you with a sneer. So I've trained them pretty well. The danger, of course, is that they start applying that to the stuff I write, which has happened in the past. They say things like, 'This isn't consistent. You use this phrase on this page and that one on the other.' And you look at the two, and you say, 'Yeah, that's true, all right.'

An executive decision was made: we're going to switch the order around. It's not the natural order of the class, in fact, but it fits better with people who are doing projects. A bunch of people are doing projects that involve non-convex problems, so today we switched things around, and we're going to do sequential convex programming first, and then we'll switch back to problems that are convex and things like that. We'll do large-scale stuff next. The good news is that we can actually pull this off, I think, in one day.

So we're going to come back to this later. There'll be several other lectures on problems that are not convex and various methods for them. There's going to be a lecture on relaxations, and we're going to have a whole study of L1-type methods for sparse solutions. Those will come later. But this is really our first foray outside of convex optimization. It's a very simple, basic method. There's very, very little you can say about it theoretically, but that's fine. It's something that works quite well.
Don't confuse it with something called sequential quadratic programming, although it's related. That's something you'll hear about a lot if you go to Google and look around; those are the things that would come up. But I just wanted to collect together some of the topics that come up. Okay, so let's do sequential convex programming. Let's see here. There we go. Okay, so I guess it's been sort of implicit for the entire last quarter and this one: the whole point of using convex optimization methods on convex problems is that you always get the global solution, up to numerical accuracy. So modulo numerical accuracy, you always get the global solution, and it's always fast. And that's fast according to theorists, too: if you want a complexity theory, there are bounds that grow like polynomials and various
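To make the idea concrete before the lecture develops it, here is a minimal sketch of a sequential convex programming loop on a toy one-dimensional problem. This example is not from the lecture; the objective `f(x) = cos(x) + 0.1 x^2` and the trust-region update rule are illustrative assumptions. At each step the non-convex objective is replaced by a convex (here, affine) model at the current point, the model is minimized over a trust region, and the step is accepted only if the true objective actually improves:

```python
import math

def f(x):
    # Hypothetical non-convex objective for illustration: cos(x) + 0.1 x^2.
    return math.cos(x) + 0.1 * x * x

def fprime(x):
    # Exact derivative of f, used to build the affine model at x_k.
    return -math.sin(x) + 0.2 * x

def scp_minimize(x0, rho=1.0, iters=100):
    """Basic SCP loop: minimize an affine approximation of f over a
    trust region |x - x_k| <= rho; accept the step only if the true
    objective decreases, otherwise shrink the trust region."""
    x = x0
    for _ in range(iters):
        g = fprime(x)
        # The affine model f(x_k) + g*(x - x_k) is minimized over the
        # trust region at the boundary, opposite the gradient direction.
        step = rho if g < 0 else -rho if g > 0 else 0.0
        cand = x + step
        if f(cand) < f(x):
            x = cand          # model was good enough; accept the step
        else:
            rho *= 0.5        # model misled us; shrink the trust region
        if rho < 1e-10:
            break             # trust region is negligible; stop
    return x

x_star = scp_minimize(3.0)
```

Since the convex model only approximates `f` locally, all this can promise is a locally optimal point; that matches the caveat that there is very little you can say about the method theoretically, even though it works well in practice.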

This note was uploaded on 04/09/2010 for the course EE 360B taught by Professor Stephen Boyd during the Fall '09 term at Stanford.

