
Simple Routines for Optimization

Robert M. Freund, with assistance from Brian W. Anthony
February 12, 2004

© 2004 Massachusetts Institute of Technology

1 Outline

- A Bisection Line-Search Algorithm for 1-Dimensional Optimization
- The Conditional-Gradient Method for Constrained Optimization (Frank-Wolfe Method)
- Subgradient Optimization
- Application of Subgradient Optimization to the Lagrange Dual Problem

2 A Bisection Line-Search Algorithm for 1-Dimensional Optimization

Consider the optimization problem:

    P : minimize_x  f(x)
        s.t.        x ∈ ℝⁿ.

Let us suppose that f(x) is a differentiable convex function. In a typical algorithm for solving P we have a current iterate value x̄, and we choose a direction d̄ by some suitable means. The direction d̄ is usually chosen to be a descent direction, defined by the following property:

    f(x̄ + εd̄) < f(x̄)  for all ε > 0 sufficiently small.

We then typically also perform the 1-dimensional line-search optimization:

    ᾱ := arg min_α f(x̄ + αd̄).

Let h(α) := f(x̄ + αd̄), whereby h(α) is a convex function of the scalar variable α, and our problem is to solve for

    ᾱ := arg min_α h(α).
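The restriction h(α) = f(x̄ + αd̄) can be sketched numerically. The quadratic f below, the iterate x̄, and the steepest-descent choice of d̄ are illustrative assumptions, not the function plotted in Figure 1:

```python
import numpy as np

# Illustrative convex function (an assumption, not the one in Figure 1):
# f(x) = (1/2) x^T Q x - c^T x with Q positive definite, hence f is convex.
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])
c = np.array([1.0, 2.0])

def f(x):
    return 0.5 * x @ Q @ x - c @ x

def grad_f(x):
    return Q @ x - c

x_bar = np.array([2.0, 2.0])   # current iterate (hypothetical)
d_bar = -grad_f(x_bar)         # steepest-descent direction: a descent direction by construction

def h(alpha):
    """Restriction of f to the line through x_bar along d_bar."""
    return f(x_bar + alpha * d_bar)

def h_prime(alpha):
    """h'(alpha) = grad f(x_bar + alpha * d_bar)^T d_bar."""
    return grad_f(x_bar + alpha * d_bar) @ d_bar

# Since d_bar is a descent direction, h'(0) < 0, so a small step decreases f.
print(h_prime(0.0) < 0, h(0.05) < h(0.0))
```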
We therefore seek a value ᾱ for which h′(ᾱ) = 0. It is elementary to show that

    h′(α) = ∇f(x̄ + αd̄)ᵀ d̄.

Property: If d̄ is a descent direction at x̄, then h′(0) < 0.

Because h(α) is a convex function of α, we also have:

Property: h′(α) is a monotone increasing function of α.

Figure 1 shows an example of a convex function of two variables to be optimized. Figure 2 shows the function h(α) obtained by restricting the function of Figure 1 to the line shown in that figure. Note from Figure 2 that h(α) is convex. Therefore its first derivative h′(α) will be a monotonically increasing function. This is shown in Figure 3.

Because h′(α) is a monotonically increasing function, we can approximately compute ᾱ, the point that satisfies h′(ᾱ) = 0, by a suitable bisection method. Suppose that we know a value α̂ for which h′(α̂) > 0. Since h′(0) < 0 and h′(α̂) > 0, the mid-value α̃ = (0 + α̂)/2 is a suitable test-point. Note the following:

- If h′(α̃) = 0, we are done.
- If h′(α̃) > 0, we can now bracket ᾱ in the interval (0, α̃).
- If h′(α̃) < 0, we can now bracket ᾱ in the interval (α̃, α̂).

This leads to the following bisection algorithm for minimizing h(α) = f(x̄ + αd̄) by solving the equation h′(α) ≈ 0.

Step 0. Set k = 0. Set αₗ := 0 and αᵤ := α̂.

Step k. Set α̃ = (αᵤ + αₗ)/2 and compute h′(α̃).
  If h′(α̃) > 0, re-set αᵤ := α̃. Set k ← k + 1.
  If h′(α̃) < 0, re-set αₗ := α̃. Set k ← k + 1.
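The bisection steps above can be sketched as follows. This is a minimal sketch assuming, as in the text, that h is convex with h′(0) < 0 and that some α̂ with h′(α̂) > 0 is known; the tolerance and iteration cap are added assumptions for a finite-precision implementation:

```python
def bisection_line_search(h_prime, alpha_hat, tol=1e-8, max_iter=100):
    """Approximate alpha_bar with h'(alpha_bar) = 0 by bisection.

    Assumes h is convex with h'(0) < 0 and h'(alpha_hat) > 0, so h'
    is monotone increasing and changes sign on [0, alpha_hat].
    """
    alpha_l, alpha_u = 0.0, alpha_hat          # Step 0
    for _ in range(max_iter):
        alpha_t = (alpha_u + alpha_l) / 2.0    # test point alpha-tilde
        g = h_prime(alpha_t)
        if abs(g) < tol or alpha_u - alpha_l < tol:
            return alpha_t
        if g > 0:
            alpha_u = alpha_t   # bracket alpha_bar in (alpha_l, alpha_t)
        else:
            alpha_l = alpha_t   # bracket alpha_bar in (alpha_t, alpha_u)
    return (alpha_u + alpha_l) / 2.0

# Hypothetical example: h(alpha) = (alpha - 1)^2, so h'(alpha) = 2(alpha - 1)
# and the minimizer is alpha_bar = 1.
alpha_star = bisection_line_search(lambda a: 2.0 * (a - 1.0), alpha_hat=4.0)
```

Each iteration halves the bracket [αₗ, αᵤ], so the error shrinks geometrically regardless of how steep h′ is near ᾱ.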

[Figure 1: A convex function f(x₁, x₂) to be optimized.]
[Figure 2: The 1-dimensional function h(α).]


These notes are from the course ESD 15.094, taught by Professor Jie Sun during the Spring 2004 term at MIT.
