Lecture 26: Badly Conditioned Convex Problems
April 25, 2007
Outline

- Shor's space-dilation method [variable metric method]
- Tikhonov's regularization
- Nesterov's averaging algorithm
Condition Number of Level Sets

Let f be convex with dom f = R^n, and let X ⊆ R^n be convex and closed:

    minimize f(x) subject to x ∈ X

- The convergence rate of subgradient methods depends on the condition numbers of the level sets of f and of the set X
- Large condition numbers characterize functions whose level sets are elongated along some directions
- This corresponds to subgradient directions being almost perpendicular to the directions toward the minima (or toward the better points in general); the sketch below illustrates the effect
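A minimal sketch (not from the lecture) of this slow-convergence behavior, assuming the nonsmooth function f(x) = max(|x1|, 10|x2|), whose level sets are elongated rectangles; at most points the normalized subgradient is nearly perpendicular to the direction toward the minimizer at the origin:

```python
import numpy as np

# f(x) = max(|x1|, 10*|x2|) has thin rectangular level sets;
# here X = R^2, so the projection step is the identity.

def f(x):
    return max(abs(x[0]), 10 * abs(x[1]))

def subgrad(x):
    # One valid subgradient of f at x.
    if abs(x[0]) >= 10 * abs(x[1]):
        return np.array([np.sign(x[0]), 0.0])
    return np.array([0.0, 10 * np.sign(x[1])])

x = np.array([1.0, 1.0])
for k in range(1, 201):
    g = subgrad(x)
    x = x - (1.0 / k) * g / np.linalg.norm(g)  # diminishing step size 1/k

print(x, f(x))  # approaches the minimizer (0, 0), but only slowly
```

The normalized-subgradient direction and the 1/k step are standard choices; the zig-zag of the iterates across the thin level sets is exactly the behavior that a large condition number produces.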
Change of Variables

- In the differentiable case, one way of improving the conditioning is a "transformation of space" [varying the metric]
- Introduce a transformation y = Bx such that the condition numbers are improved in the y-space; the choice of the transformation B is not always obvious (see the example below)
- In the non-differentiable case, it makes sense to consider transformations of space that improve the angle formed by the subgradient direction and the directions toward the minima
- Such an approach, using a space dilation in the direction of a subgradient, was proposed independently by Yudin and Nemirovski (1976) and by Shor (1977)
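A hypothetical numerical illustration of the differentiable case (the function and the B below are assumptions, not taken from the lecture): f(x) = x1^2 + 100 x2^2 has condition number 100, but with B = diag(1, 10) the transformed function g(y) = f(B^{-1} y) = y1^2 + y2^2 is perfectly conditioned, and a single gradient step in y-space reaches the minimizer:

```python
import numpy as np

B = np.diag([1.0, 10.0])   # chosen so that g(y) = f(B^{-1} y) = ||y||^2
B_inv = np.linalg.inv(B)

def grad_f(x):
    # Gradient of f(x) = x1^2 + 100*x2^2.
    return np.array([2.0 * x[0], 200.0 * x[1]])

# By the chain rule, grad g(y) = B^{-T} grad_f(B^{-1} y).
y = B @ np.array([1.0, 1.0])                 # start from x = (1, 1)
y = y - 0.5 * (B_inv.T @ grad_f(B_inv @ y))  # one step, ideal step size 1/2

print(B_inv @ y)  # back in x-space: [0. 0.], the minimizer
```

In practice such a B is not known in advance, which is exactly why its choice is "not always obvious".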
Illustration

- Consider a point where a subgradient is almost perpendicular to a direction d pointing toward the minima
- Dilate the space along the direction of the subgradient s
- Any vector x can be decomposed as x = ξ + v, where
  - ξ is the component of x parallel to s [the projection of x on s]
  - v is a vector orthogonal to s
- Under this "dilation" transformation:
  - The component v orthogonal to s is unchanged
  - The component parallel to s is stretched by a factor [the dilation parameter]
- As a result, in the transformed space [s unchanged] the vector s is "far" from being perpendicular to the transformed direction d; a concrete sketch follows below
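A small sketch of the dilation just described; the operator R_α(s) = I + (α − 1) s sᵀ/||s||² is an assumed concrete form consistent with the slide (it multiplies the component along s by α and leaves the orthogonal component untouched):

```python
import numpy as np

def dilation(s, alpha):
    # R_alpha(s) = I + (alpha - 1) * u u^T with u = s/||s||:
    # stretches the s-component of a vector by alpha, keeps the rest.
    u = np.asarray(s, dtype=float)
    u = u / np.linalg.norm(u)
    return np.eye(len(u)) + (alpha - 1.0) * np.outer(u, u)

s = np.array([1.0, 0.0])   # subgradient direction
x = np.array([0.3, 2.0])   # decomposition: xi = (0.3, 0), v = (0, 2.0)
R = dilation(s, alpha=5.0)
print(R @ x)               # [1.5, 2.0]: xi stretched by 5, v unchanged
```

Applying R to a direction d that is nearly orthogonal to s enlarges its s-component relative to the rest, so in the transformed space d forms a visibly better angle with s.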