# EE364b Homework 3 (Prof. S. Boyd)


1. *Minimizing a quadratic.* Consider the subgradient method with constant step size $\alpha$, used to minimize the quadratic function $f(x) = (1/2)x^T P x + q^T x$, where $P \succ 0$. For which values of $\alpha$ do we have $x^{(k)} \to x^\star$, for any $x^{(1)}$? What value of $\alpha$ gives the fastest asymptotic convergence?

2. *Step sizes that guarantee moving closer to the optimal set.* Consider the subgradient method iteration $x^+ = x - \alpha g$, where $g \in \partial f(x)$. Show that if $0 < \alpha < 2(f(x) - f^\star)/\|g\|_2^2$ (which is twice Polyak's optimal step size value), we have $\|x^+ - x^\star\|_2 < \|x - x^\star\|_2$ for any optimal point $x^\star$. This implies that $\mathrm{dist}(x^+, X^\star) < \mathrm{dist}(x, X^\star)$. (Methods in which successive iterates move closer to the optimal set are called *Fejér monotone*. Thus, the subgradient method with Polyak's optimal step size is Fejér monotone.)

3. *A variation on alternating projections.* We consider the problem of finding a point in the intersection $C = C_1 \cap \cdots \cap C_m \neq \emptyset$ of convex sets $C_1, \ldots, C_m \subseteq \mathbf{R}^n$. To do this, we use alternating projections to find a point in the intersection of the two sets $C_1 \times \cdots \times C_m \subseteq \mathbf{R}^{mn}$ and $\{(z_1, \ldots, z_m) \in \mathbf{R}^{mn} \mid z_1 = \cdots = z_m\} \subseteq \mathbf{R}^{mn}$. Show that alternating projections on these two sets is equivalent to the following iteration: project the current point (in $\mathbf{R}^n$) onto each convex set, and then average the results. Draw a simple picture to illustrate this.
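For problem 1, since $P \succ 0$ the function $f$ is differentiable, so the subgradient method reduces to gradient descent with $\nabla f(x) = Px + q$. A numerical sketch on a hypothetical 2-D instance (the particular $P$, $q$, and step sizes below are illustrative choices, not part of the problem statement):

```python
import numpy as np

# Hypothetical instance: P diagonal with eigenvalues 1 and 10, so each step
# x+ = x - alpha*(P x + q) scales the error in coordinate i by |1 - alpha*lambda_i|.
P = np.diag([1.0, 10.0])
q = np.array([1.0, -2.0])
x_star = -np.linalg.solve(P, q)          # the unique minimizer, since P > 0

def dist_after(alpha, k, x0=np.array([5.0, 5.0])):
    """Distance to x_star after k steps of x+ = x - alpha * grad f(x)."""
    x = x0.copy()
    for _ in range(k):
        x = x - alpha * (P @ x + q)      # grad f(x) = P x + q
    return float(np.linalg.norm(x - x_star))
```

Experimenting with `dist_after` for different values of `alpha` (below, at, and above $2/\lambda_{\max}(P)$, and at $2/(\lambda_{\min} + \lambda_{\max})$) suggests the pattern the problem asks you to characterize.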
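The inequality in problem 2 can be sanity-checked numerically on a simple instance. A minimal sketch, assuming $f(x) = \|x\|_1$ (a hypothetical choice: then $f^\star = 0$ is attained at $x^\star = 0$, and $g = \mathrm{sign}(x)$ is a subgradient wherever no entry of $x$ is zero):

```python
import numpy as np

# Hypothetical instance: f(x) = ||x||_1, with f_star = 0 attained at x_star = 0.
def f(x):
    return float(np.abs(x).sum())

rng = np.random.default_rng(1)
x = rng.standard_normal(8)            # current iterate; entries are nonzero a.s.
g = np.sign(x)                        # a subgradient of the l1 norm at x
f_star = 0.0

def step(t):
    """One subgradient step with alpha = t * 2*(f(x) - f_star)/||g||^2.

    For 0 < t < 1 this alpha is below twice Polyak's step, so the iterate
    should move strictly closer to x_star = 0; t = 0.5 is Polyak's step.
    """
    alpha = t * 2.0 * (f(x) - f_star) / float(g @ g)
    return x - alpha * g
```

Checking `np.linalg.norm(step(t)) < np.linalg.norm(x)` for several `t` in $(0, 1)$ confirms the claimed strict decrease; the proof itself follows from expanding $\|x - \alpha g - x^\star\|_2^2$ and using the subgradient inequality.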
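The "project onto each set, then average" iteration from problem 3 can be tried on a small instance. A sketch under assumed sets (with $m = 2$: the unit Euclidean ball and a halfspace, both hypothetical choices whose intersection is nonempty):

```python
import numpy as np

# Hypothetical sets: C1 is the unit Euclidean ball, C2 = {x : x[0] >= 0.5}.
def proj_ball(x):                     # Euclidean projection onto C1
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def proj_halfspace(x):                # Euclidean projection onto C2
    y = x.copy()
    y[0] = max(y[0], 0.5)
    return y

x = np.array([-3.0, 4.0])             # arbitrary starting point
for _ in range(500):
    # the iteration from problem 3: project onto each set, then average
    x = 0.5 * (proj_ball(x) + proj_halfspace(x))
```

After many iterations `x` satisfies both constraints to within numerical tolerance, consistent with alternating projections on the product set and the "diagonal" subspace converging to a point of $C_1 \cap C_2$.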
