Advanced Operations Research Techniques
IE316 Lecture 15
Dr. Ted Ralphs

Reading for This Lecture

Bertsimas, Chapter 5.
Global Dependence on the Right-hand Side Vector

Consider a family of polyhedra parameterized by the vector b:

    P(b) = { x ∈ ℝⁿ : Ax = b, x ≥ 0 }.

Note that S = { b : P(b) is nonempty } = { Ax : x ≥ 0 } is a convex set. We now consider the function

    F(b) = min { cᵀx : x ∈ P(b) }.

In what follows, we will assume feasibility of the dual and hence that F(b) is finite for all b ∈ S. We will try to characterize the function F(b).
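As a concrete illustration, F(b) can be evaluated numerically by solving the LP for each b. The instance below is our own toy example (not from the lecture): minimize x₁ + x₂ subject to x₁ − x₂ = b, x ≥ 0, for which F(b) = |b| and S is all of ℝ.

```python
# Sketch: evaluating F(b) = min{c^T x : Ax = b, x >= 0} on a toy instance.
# Here A = [1, -1] and c = [1, 1], so the optimum is x = (b, 0) for b >= 0
# and x = (0, -b) for b < 0, giving F(b) = |b|.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, -1.0]])
c = np.array([1.0, 1.0])

def F(b):
    """Optimal cost of min c^T x s.t. Ax = b, x >= 0 (None if infeasible)."""
    res = linprog(c, A_eq=A, b_eq=[b], bounds=[(0, None)] * 2, method="highs")
    return res.fun if res.success else None

for b in [-2.0, -0.5, 0.0, 1.0, 3.0]:
    print(b, F(b))  # F(b) == |b| for this instance
```

Evaluating F on a grid of right-hand sides like this is a quick way to visualize the convexity results that follow.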
Characterizing F(b)

For a particular vector b̂, suppose there is a nondegenerate optimal basic feasible solution given by basis B. As before, nondegeneracy implies that we can perturb b̂ without changing the optimal basis. Therefore, we have

    F(b) = c_Bᵀ B⁻¹ b = pᵀ b,  for b "close to" b̂.

This means that in the vicinity of b̂, F(b) is a linear function of b. Now consider the extreme points p¹, …, pᴺ of the dual polyhedron. There must be an extremal optimum to the dual, and so we can rewrite F(b) as

    F(b) = max_{i = 1, …, N} (pⁱ)ᵀ b.

Hence, F(b) is a piecewise linear convex function.
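The max-of-linear-functions characterization can be checked numerically on a small instance of our own devising (not the lecture's): for min x₁ + x₂ subject to x₁ − x₂ = b, x ≥ 0, the dual is max pb subject to p ≤ 1 and −p ≤ 1, whose extreme points are p = +1 and p = −1, so F(b) = max(b, −b) = |b|.

```python
# Sketch: verify F(b) = max_i (p^i)^T b over the dual extreme points,
# and check midpoint convexity, on a toy instance (assumed example).
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, -1.0]])
c = np.array([1.0, 1.0])
duals = [1.0, -1.0]  # extreme points p^1, p^2 of the dual polyhedron

def F(b):
    res = linprog(c, A_eq=A, b_eq=[b], bounds=[(0, None)] * 2, method="highs")
    return res.fun

# F agrees with the maximum over dual extreme points at sample points
for b in [-2.0, 0.5, 4.0]:
    assert abs(F(b) - max(p * b for p in duals)) < 1e-9

# Midpoint convexity: F((b1+b2)/2) <= (F(b1) + F(b2)) / 2
b1, b2 = -2.0, 4.0
assert F((b1 + b2) / 2) <= (F(b1) + F(b2)) / 2 + 1e-9
print("F(b) = max_i (p^i)^T b verified on sample points")
```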
Another Parameterization

Now consider the function f(θ) = F(b̂ + θd) for a particular vector b̂ and direction d. Using the same approach, we obtain

    f(θ) = max_{i = 1, …, N} (pⁱ)ᵀ (b̂ + θd),  b̂ + θd ∈ S.

Again, this is a piecewise linear convex function.
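On the same kind of toy instance (our own, not the lecture's), the one-dimensional slice f(θ) is easy to trace: with min x₁ + x₂ subject to x₁ − x₂ = b, x ≥ 0 (so F(b) = |b|), taking b̂ = 1 and d = 1 gives f(θ) = |1 + θ|, which has slope −1 for θ < −1 and slope +1 for θ > −1, i.e. piecewise linear with a single breakpoint.

```python
# Sketch: tracing f(theta) = F(b_hat + theta*d) on an assumed toy instance.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, -1.0]])
c = np.array([1.0, 1.0])
b_hat, d = 1.0, 1.0

def f(theta):
    """f(theta) = F(b_hat + theta * d); equals |1 + theta| for this instance."""
    res = linprog(c, A_eq=A, b_eq=[b_hat + theta * d],
                  bounds=[(0, None)] * 2, method="highs")
    return res.fun

for theta in [-3.0, -1.0, 0.0, 2.0]:
    print(theta, f(theta))  # matches |1 + theta|
```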
The Set of All Dual Optimal Solutions

Consider once more the function F(b).

Definition 1. A vector p ∈ ℝᵐ is a subgradient of F at b̂ if

    F(b̂) + pᵀ(b − b̂) ≤ F(b)  for all b ∈ S.

Theorem 1. Suppose that the linear program min { cᵀx : Ax = b̂, x ≥ 0 } is feasible and that the optimal cost is finite. Then p is an optimal solution to the dual if and only if it is a subgradient of F at b̂.
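Theorem 1 can be sanity-checked numerically: solve the dual at b̂ and verify the subgradient inequality at sampled right-hand sides. The instance below is again our own toy example (min x₁ + x₂ subject to x₁ − x₂ = b, x ≥ 0, so F(b) = |b|), with b̂ = 2; the dual optimum there is p = 1.

```python
# Sketch: a dual optimal solution p at b_hat satisfies
# F(b_hat) + p*(b - b_hat) <= F(b), i.e. p is a subgradient of F at b_hat.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, -1.0]])
c = np.array([1.0, 1.0])
b_hat = 2.0

def F(b):
    res = linprog(c, A_eq=A, b_eq=[b], bounds=[(0, None)] * 2, method="highs")
    return res.fun

# Dual: max p * b_hat s.t. A^T p <= c, p free; linprog minimizes, so negate.
dual = linprog([-b_hat], A_ub=A.T, b_ub=c,
               bounds=[(None, None)], method="highs")
p = dual.x[0]  # optimal dual solution (p = 1 for this instance)

# Subgradient inequality at sampled right-hand sides
for b in [-3.0, -1.0, 0.0, 2.0, 5.0]:
    assert F(b_hat) + p * (b - b_hat) <= F(b) + 1e-9
print("p =", p, "is a subgradient of F at b_hat =", b_hat)
```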
Global Dependence on the Cost Vector

We have similar results for