Lecture 14: Algorithms for Unconstrained Optimization
Convex Optimization
March 7, 2007

Outline
- Terminology and assumptions
- Gradient descent method
- Newton's method

Unconstrained Minimization

    minimize f(x)

Assumptions:
- The function f is convex and continuously differentiable over dom f.
- The optimal value f* = inf_x f(x) is finite and attained [an optimal point x* exists].

Minimization methods:
- Produce a sequence of points x_k ∈ dom f, k = 0, 1, ..., such that f(x_k) → f*.
- Can be interpreted as iterative methods for solving the optimality condition ∇f(x*) = 0.

Initial Point and Level Set Assumptions

The algorithms that we will consider require a starting point x_0 such that:
- It is feasible: x_0 ∈ dom f.
- The level set L = {x | f(x) ≤ f(x_0)} is closed.

The closedness condition is not always easy to verify. It is satisfied when all level sets of f are closed; this is guaranteed when any of the following holds:
- The epigraph epi f of f is closed [f is closed, i.e., f is lower semicontinuous].
- The domain of f is the entire space: dom f = R^n.
- f increases to +∞ as the boundary of the domain is approached: f(x) → ∞ as x → bd(dom f).

An example of a differentiable convex function with closed level sets is the log barrier

    f(x) = − Σ_{i=1}^m ln(b_i − a_i^T x),    dom f = {x | Ax < b}.

Strong Convexity Assumption and Implications

In convergence analysis, strong convexity of f is often used.
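To make the starting-point and level-set assumptions concrete, here is a minimal sketch (my addition, not from the lecture) of gradient descent with backtracking line search applied to a log-barrier function of the form above. The specific constraints are a hypothetical choice: the box {x | −1 < x_i < 1} written as Ax < b. Treating f as +∞ outside dom f makes the line search automatically keep the iterates feasible.

```python
import numpy as np

# Hypothetical example domain: the box {-1 < x_i < 1}, written as Ax < b
A = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([1.0, 1.0, 1.0, 1.0])

def f(x):
    """Log barrier f(x) = -sum ln(b_i - a_i^T x); +inf outside dom f."""
    s = b - A @ x
    return np.inf if np.any(s <= 0) else -np.sum(np.log(s))

def grad(x):
    """Gradient of the log barrier: sum_i a_i / (b_i - a_i^T x)."""
    return A.T @ (1.0 / (b - A @ x))

x = np.array([0.5, -0.3])   # feasible starting point x_0 (in dom f)
for _ in range(100):
    g = grad(x)
    if np.linalg.norm(g) < 1e-8:
        break
    t = 1.0
    # Backtracking line search; because f is +inf outside dom f,
    # rejecting steps that leave the domain happens automatically.
    while f(x - t * g) > f(x) - 0.5 * t * (g @ g):
        t *= 0.5
    x = x - t * g

print(x)   # converges toward the minimizer of the barrier (here, the origin)
```

For this symmetric box, the minimizer of the barrier (its analytic center) is the origin, which the iterates approach quickly.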
Strong convexity assumption: f is twice continuously differentiable and there exists an m > 0 such that

    ∇²f(x) ⪰ mI    for all x ∈ L.

Implications:

- Lower bound on f over L:

      f(y) ≥ f(x) + ∇f(x)^T (y − x) + (m/2) ‖y − x‖²    for all x, y ∈ L.    (1)

- Minimizing the right-hand side with respect to y gives

      f(y) ≥ f(x) − (1/(2m)) ‖∇f(x)‖².

- Taking the minimum over y ∈ L on the left-hand side gives

      f(x) − f* ≤ (1/(2m)) ‖∇f(x)‖²,

  which is useful as a stopping criterion (if you know m).

- Relation (1) with x = x* and f(y) ≤ f(x_0) for y ∈ L imply that L is bounded.

Upper Bound on the Hessian and f over the Level Set

For a strongly convex f:
- The level set L = {x | f(x) ≤ f(x_0)} is bounded (just shown).
- The maximum eigenvalue of the Hessian ∇²f(x) is a continuous function of x over L.
- Hence, the maximum eigenvalue of the Hessian is bounded over L: there is a constant M such that

      ∇²f(x) ⪯ MI    for all x ∈ L.

Upper bound on f over L:

    f(y) ≤ f(x) + ∇f(x)^T (y − x) + (M/2) ‖y − x‖²    for all x, y ∈ L.

Minimizing over y on both sides gives

    f* ≤ f(x) − (1/(2M)) ‖∇f(x)‖²    for all x ∈ L.
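The lower and upper bounds above can be checked numerically. The following sketch (my addition, not from the lecture) uses a strongly convex quadratic f(x) = ½ xᵀQx with a hypothetical matrix Q: its Hessian is Q everywhere, so m and M are exactly the smallest and largest eigenvalues of Q, and f* = 0 is attained at x* = 0.

```python
import numpy as np

# Hypothetical strongly convex quadratic: f(x) = 0.5 x^T Q x, f* = 0 at x* = 0.
# The Hessian is Q at every point, so mI <= Hessian <= MI holds globally
# with m and M the extreme eigenvalues of Q.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
evals = np.linalg.eigvalsh(Q)          # eigenvalues in ascending order
m, M = evals[0], evals[-1]

def f(x):
    return 0.5 * x @ Q @ x

def grad(x):
    return Q @ x

f_star = 0.0
rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.normal(size=2)
    g2 = grad(x) @ grad(x)
    # Suboptimality bound from strong convexity:  f(x) - f* <= ||grad||^2 / (2m)
    assert f(x) - f_star <= g2 / (2 * m) + 1e-12
    # Bound from the Hessian upper bound:  f* <= f(x) - ||grad||^2 / (2M)
    assert f_star <= f(x) - g2 / (2 * M) + 1e-12
```

The first assertion is the stopping-criterion inequality: once ‖∇f(x)‖² ≤ 2mε, the current point is guaranteed to be ε-suboptimal.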
Spring '07, Angelia Nedich