EE364b                                                        Prof. S. Boyd

EE364b Homework 2

1. Subgradient optimality conditions for nondifferentiable inequality constrained optimization. Consider the problem

       minimize    $f_0(x)$
       subject to  $f_i(x) \leq 0$, $i = 1, \ldots, m$,

   with variable $x \in \mathbf{R}^n$. We do not assume that $f_0, \ldots, f_m$ are convex. Suppose that $x$ and $\lambda \succeq 0$ satisfy primal feasibility,

       $f_i(x) \leq 0, \quad i = 1, \ldots, m$,

   dual feasibility,

       $0 \in \partial f_0(x) + \sum_{i=1}^m \lambda_i \partial f_i(x)$,

   and the complementarity condition

       $\lambda_i f_i(x) = 0, \quad i = 1, \ldots, m$.

   Show that $x$ is optimal, using only a simple argument and the definition of subgradient. Recall that we do not assume the functions $f_0, \ldots, f_m$ are convex.

2. Optimality conditions and coordinate-wise descent for $\ell_1$-regularized minimization. We consider the problem of minimizing $\phi(x) = f(x) + \lambda \| x \|_1$, where $f : \mathbf{R}^n \to \mathbf{R}$ is convex and differentiable, and $\lambda \geq 0$. The number $\lambda$ is the regularization parameter, and is used to control the trade-off between small ...
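A note on problem 1: the only tool the argument needs is the definition of a subgradient, restated below for a generic function $h : \mathbf{R}^n \to \mathbf{R}$ (the symbol $h$ is a placeholder, not from the handout); the definition makes sense, and is used here, with no convexity assumption on $h$.

$$
g \in \partial h(x)
\quad\Longleftrightarrow\quad
h(y) \;\geq\; h(x) + g^T (y - x) \ \ \text{for all } y \in \mathbf{R}^n .
$$

A note on problem 2: to see coordinate-wise descent on $\phi(x) = f(x) + \lambda \| x \|_1$ concretely, the sketch below takes the special case $f(x) = \frac{1}{2}\| Ax - b \|_2^2$, where each coordinate update has a closed form given by soft-thresholding. The least-squares choice of $f$, and the names soft_threshold and coordinate_descent_l1, are illustrative assumptions and not part of the handout; the problem itself only assumes $f$ convex and differentiable.

import numpy as np


def soft_threshold(u, t):
    # prox of t*|.| : argmin over x of 0.5*(x - u)**2 + t*abs(x)
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)


def coordinate_descent_l1(A, b, lam, iters=100):
    # Coordinate-wise descent on 0.5*||A x - b||_2^2 + lam*||x||_1.
    # Illustrative sketch only: assumes a least-squares loss, not the
    # general convex differentiable f of the problem statement.
    m, n = A.shape
    x = np.zeros(n)
    col_norm_sq = (A ** 2).sum(axis=0)      # ||a_j||_2^2 for each column j
    for _ in range(iters):
        for j in range(n):
            if col_norm_sq[j] == 0.0:
                continue
            # residual ignoring coordinate j's current contribution
            r = b - A @ x + A[:, j] * x[j]
            # exact minimization over x_j with the other coordinates fixed
            x[j] = soft_threshold(A[:, j] @ r, lam) / col_norm_sq[j]
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    b = rng.standard_normal(20)
    print(coordinate_descent_l1(A, b, lam=1.0))

Each inner update minimizes $\phi$ exactly over the single coordinate $x_j$ with the others held fixed; because the nondifferentiable term $\lambda \| x \|_1$ is separable across coordinates, these one-dimensional minimizations are cheap and well defined.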