EE364b                                                        Prof. S. Boyd

EE364b Homework 4

1. Projection onto the probability simplex. In this problem you will work out a simple method for finding the Euclidean projection y of x ∈ R^n onto the probability simplex

    P = { z | z ⪰ 0, 1^T z = 1 }.

Hints. Consider the problem of minimizing (1/2)‖y − x‖_2^2 subject to y ⪰ 0, 1^T y = 1. Form the partial Lagrangian

    L(y, ν) = (1/2)‖y − x‖_2^2 + ν(1^T y − 1),

leaving the constraint y ⪰ 0 implicit. Show that y = (x − ν1)_+ minimizes L(y, ν) over y ⪰ 0.

2. Minimizing expected maximum violation. We consider the problem of minimizing the expected maximum violation of a set of linear constraints, subject to a norm bound on the variable:

    minimize    E max(b − Ax)_+
    subject to  ‖x‖_∞ ≤ 1,

where the data A ∈ R^{m×n} and b ∈ R^m are random. We consider a specific problem instance with m = 3 and n = 2. The entries of A and b vary uniformly (and independently) ±0.1 around their expected values,

    E A = [  1    1  ]        E b = [ 9/10 ]
          [ 1/2   1  ],             [  1   ].
          [  1   1/2 ]              [ 9/10 ]

(a) Solution via stochastic subgradient. Use a stochastic subgradient method with step size 1/k to compute a solution x_stoch, starting from x = 0, with M = 1 subgradient sample per iteration. Run the algorithm for 5000 iterations. Estimate the objective value obtained by x_stoch using Monte Carlo, with M = 1000 samples. Plot the distribution of max(b − Ax_stoch) from these samples. (In this plot, points to the left of 0 correspond to no violation of the inequalities.)

(b) Maximum margin heuristic...
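The hint in problem 1 reduces the projection to a one-dimensional search: since y = (x − ν1)_+ minimizes the partial Lagrangian, it suffices to find the ν at which 1^T (x − ν1)_+ = 1. A minimal sketch of this idea (not the official solution — the bisection tolerance and bracketing interval are choices made here, not given in the problem):

```python
import numpy as np

def project_simplex(x, tol=1e-9):
    """Euclidean projection of x onto {z : z >= 0, 1^T z = 1}.

    The projection has the form y = (x - nu*1)_+, where nu is chosen
    so that sum((x - nu)_+) = 1. That sum is nonincreasing in nu, so
    we can find nu by bisection.
    """
    # Bracket nu: at lo the sum is >= 1, at hi it is 0.
    lo, hi = x.min() - 1.0, x.max()
    while hi - lo > tol:
        nu = 0.5 * (lo + hi)
        if np.maximum(x - nu, 0.0).sum() > 1.0:
            lo = nu
        else:
            hi = nu
    return np.maximum(x - 0.5 * (lo + hi), 0.0)
```

For example, projecting x = (2, 0) gives ν = 1 and y = (1, 0), which already lies on the simplex boundary.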
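Part 2(a) can be sketched as a stochastic projected subgradient loop: at each step draw one sample (A, b), take a subgradient of max(b − Ax)_+ (zero if no constraint is violated, otherwise −a_i^T for a maximizing row i), step with size 1/k, and project onto the ∞-norm ball by clipping. The nominal data below are read off the problem instance as a 3×2 matrix; treat that shape, the seed, and the plain mean-based Monte Carlo estimate as assumptions of this sketch, not part of the assignment:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed nominal data (3 x 2 instance); entries vary uniformly +-0.1.
A_bar = np.array([[1.0, 1.0], [0.5, 1.0], [1.0, 0.5]])
b_bar = np.array([0.9, 1.0, 0.9])

def sample():
    """Draw one realization of (A, b)."""
    A = A_bar + rng.uniform(-0.1, 0.1, A_bar.shape)
    b = b_bar + rng.uniform(-0.1, 0.1, b_bar.shape)
    return A, b

def subgrad(x, A, b):
    """A subgradient of f(x; A, b) = max(b - Ax)_+ at x."""
    r = b - A @ x
    i = np.argmax(r)
    return -A[i] if r[i] > 0 else np.zeros_like(x)

x = np.zeros(2)
for k in range(1, 5001):
    A, b = sample()
    # Subgradient step with size 1/k, then project onto ||x||_inf <= 1.
    x = np.clip(x - (1.0 / k) * subgrad(x, A, b), -1.0, 1.0)

# Monte Carlo estimate of the objective at the final iterate.
vals = [max(0.0, (b - A @ x).max()) for A, b in (sample() for _ in range(1000))]
print(x, np.mean(vals))
```

A histogram of max(b − Ax) over the 1000 samples (rather than its positive part) gives the violation-distribution plot the problem asks for.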
This note was uploaded on 04/09/2010 for the course EE 360B taught by Professor Stephen Boyd during the Fall '09 term at Stanford.