# Lecture 12: Advanced Mathematical Programming (IE417)

Advanced Mathematical Programming, IE417, Lecture 12. Dr. Ted Ralphs.

## Reading for This Lecture

Chapter 6, Section 4

## Solving the Lagrangian Dual

Since Θ(μ, v) is concave, we can use a line-search algorithm to maximize it.

- If Θ is differentiable, then ∇Θ(μ*, v*)ᵀ = (g(x*)ᵀ, h(x*)ᵀ) is an ascent direction. Move in this direction as far as remains feasible.
- Move in a projected direction if μᵢ = 0 and gᵢ(x*) < 0 for some index i, so that μ stays nonnegative.
- If Θ is not differentiable, then we have to work with subgradients. Finding the direction of steepest ascent in this case is itself an optimization problem.
- In practice, you may not want to move as far as possible at each step.
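As a concrete illustration of the differentiable case, here is a minimal sketch of projected gradient ascent on the dual of a toy problem (the problem itself is my own example, not from the lecture): minimize x² subject to g(x) = 1 − x ≤ 0. The Lagrangian subproblem min_x [x² + μ(1 − x)] has the unique solution x*(μ) = μ/2, so Θ(μ) = μ − μ²/4 is differentiable with Θ′(μ) = g(x*(μ)) = 1 − μ/2, exactly the ascent direction described above.

```python
# Projected gradient ascent on a differentiable Lagrangian dual (toy example,
# not from the lecture): min x^2  s.t.  g(x) = 1 - x <= 0.

def solve_subproblem(mu):
    """Minimize the Lagrangian x^2 + mu*(1 - x) over x (closed form: x = mu/2)."""
    return mu / 2.0

def dual_ascent(mu=0.0, step=0.5, iters=100):
    for _ in range(iters):
        x = solve_subproblem(mu)
        ascent = 1.0 - x                   # g(x*) is the gradient of Theta at mu
        mu = max(0.0, mu + step * ascent)  # project back onto mu >= 0
    return mu, solve_subproblem(mu)

mu_star, x_star = dual_ascent()
# Converges to mu* = 2, x* = 1; Theta(mu*) = 1 equals the primal optimum x*^2 = 1,
# as expected since strong duality holds for this convex problem.
```

The projection step `max(0.0, ...)` is the simplest way to keep μ feasible; it plays the role of the projected direction mentioned above when a component of μ hits zero.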
## Subgradient Algorithm for the Lagrangian Dual

The idea of the subgradient algorithm is to first fix (μ, v) and solve the Lagrangian subproblem to get x. Then update (μ, v) by moving in an ascent direction for Θ.
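The two alternating steps can be sketched on a nondifferentiable example (again my own toy problem, not from the lecture): minimize −x over x ∈ {0, 1, 2, 3} subject to g(x) = x − 1.5 ≤ 0. Here Θ(μ) = min_x [−x + μ(x − 1.5)] is piecewise linear and concave, with a kink at its maximizer μ = 1 where Θ(1) = −1.5, so gradients are unavailable there and we step along the subgradient g(x*) with a diminishing step size.

```python
# Subgradient method on a nondifferentiable Lagrangian dual (toy example):
# min -x over x in {0, 1, 2, 3}  s.t.  g(x) = x - 1.5 <= 0.

X = [0, 1, 2, 3]

def solve_subproblem(mu):
    """Fix mu and minimize the Lagrangian -x + mu*(x - 1.5) over the finite set X."""
    return min(X, key=lambda x: -x + mu * (x - 1.5))

def subgradient_method(mu=0.0, iters=2000):
    best_theta = float("-inf")
    for k in range(1, iters + 1):
        x = solve_subproblem(mu)                 # step 1: solve the subproblem
        theta = -x + mu * (x - 1.5)
        best_theta = max(best_theta, theta)      # track the best dual value seen
        # step 2: g(x) = x - 1.5 is a subgradient of Theta at mu; take a
        # diminishing step 1/k and project mu back onto mu >= 0.
        mu = max(0.0, mu + (x - 1.5) / k)
    return mu, best_theta

mu_final, theta_best = subgradient_method()
```

Note that a subgradient step need not increase Θ, so the best value seen so far is recorded rather than the last one; with the diminishing steps, μ oscillates around the optimum μ = 1 with shrinking amplitude and `theta_best` approaches Θ(1) = −1.5.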
