IEOR E4007                                    G. Iyengar
Nov. 5th, 2008                                Homework # 4
Due: Nov. 19th

1. Problem on unconstrained optimization

For each of the following optimization problems, either verify that the given x is a stationary point or find a direction d that locally improves the objective at x.

(a) max 10 x_1^2 + 12 ln(x_2),  x = (1, 2)

The gradient at x = (1, 2) is

    ∇f(x) = (20 x_1, 12/x_2)^T = (20, 6)^T.

Since the gradient is nonzero, the point (1, 2) is not a stationary point. Since this is a maximization problem, the direction Δx = ∇f(x) improves the objective.

(b) max x_1 x_2 − 10 x_1 + 4 x_2,  x = (−4, 10)

The gradient at x = (−4, 10) is

    ∇f(x) = (x_2 − 10, x_1 + 4)^T = (0, 0)^T,

i.e. x = (−4, 10) is a stationary point.

2. Local optimality

For each of the following functions f, classify the specified x as definitely a local maximum, possibly a local maximum, definitely a local minimum, possibly a local minimum, or none of the above.

(a) f(x) = −x_1^2 − 6 x_1 x_2 − 9 x_2^2,  x = (−3, 1)

The gradient at x = (−3, 1) is

    ∇f(x) = (−2 x_1 − 6 x_2, −6 x_1 − 18 x_2)^T = (0, 0)^T,

so x is a stationary point. The Hessian at x is

    ∇²f(x) = [ −2   −6 ]
             [ −6  −18 ]

and its eigenvalues are {0, −20}. The Hessian is negative semidefinite but not negative definite, so the point is possibly a local maximum.

(b) f(x) = 12 x_2 − x_1^2 + 3 x_1 x_2 − 3 x_2^2,  x = (12, 8)

The gradient at x = (12, 8) is

    ∇f(x) = (−2 x_1 + 3 x_2, 12 + 3 x_1 − 6 x_2)^T = (0, 0)^T,

so x is a stationary point. The Hessian at x is

    ∇²f(x) = [ −2   3 ]
             [  3  −6 ]

and its eigenvalues are {−0.3944, −7.6056} (i.e. −4 ± √13). The Hessian is negative definite, so the point is definitely a local maximum.

(c) f(x) = 6 x_1 + ln(x_1) + x_2^2,  x = (1, 3)

The gradient at x = (1, 3) is

    ∇f(x) = (6 + 1/x_1, 2 x_2)^T = (7, 6)^T.

Since the gradient is nonzero, x is not a stationary point.
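The gradient and eigenvalue computations in problems 1 and 2 can be checked numerically. This sketch (not part of the original solution) uses NumPy to evaluate each gradient at the given point and to compute the Hessian eigenvalues for 2(a) and 2(b):

```python
import numpy as np

# Problem 1(a): f(x) = 10 x1^2 + 12 ln(x2) at x = (1, 2)
x1, x2 = 1.0, 2.0
grad_a = np.array([20 * x1, 12 / x2])   # (20, 6): nonzero, so not stationary

# Problem 1(b): f(x) = x1*x2 - 10 x1 + 4 x2 at x = (-4, 10)
y1, y2 = -4.0, 10.0
grad_b = np.array([y2 - 10, y1 + 4])    # (0, 0): stationary point

# Problem 2(a): Hessian of f = -x1^2 - 6 x1 x2 - 9 x2^2
H_a = np.array([[-2.0, -6.0],
                [-6.0, -18.0]])
eig_a = np.linalg.eigvalsh(H_a)         # {-20, 0}: negative semidefinite

# Problem 2(b): Hessian of f = 12 x2 - x1^2 + 3 x1 x2 - 3 x2^2
H_b = np.array([[-2.0, 3.0],
                [3.0, -6.0]])
eig_b = np.linalg.eigvalsh(H_b)         # -4 +/- sqrt(13): negative definite
```

A semidefinite Hessian (a zero eigenvalue, as in 2(a)) is why the second-order test is inconclusive there: the point is only *possibly* a local maximum, while the strictly negative eigenvalues in 2(b) certify a local maximum.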
(d) f(x) = 4 x_1^2 + 3/x_2 − 8 x_1 + 3 x_2,  x = (1, 1)

The gradient at x = (1, 1) is

    ∇f(x) = (8 x_1 − 8, −3/x_2^2 + 3)^T = (0, 0)^T,

so x is a stationary point. The Hessian at x is

    ∇²f(x) = [ 8  0 ]
             [ 0  6 ]

and its eigenvalues are {6, 8}. The Hessian is positive definite, so the point is definitely a local minimum.

3. Problem on recognizing convex functions/sets

Determine whether each of the following is a convex program.

(a) min { x_1 + x_2 : x_1 x_2 ≤ 9, |x_1| ≤ 5, |x_2| ≤ 5 }

The constraint x_1 x_2 ≤ 9 is nonconvex, so this problem is unlikely to be convex. Consider the two feasible points x = (5, 9/5)^T and y = (9/5, 5)^T, and take θ = 1/2. The point z = 0.5 x + 0.5 y = (17/5)(1, 1)^T violates the constraint, since (17/5)^2 = 11.56 > 9. Thus the feasible region is not convex, and the optimization problem is not a convex program.
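The counterexample in 3(a) and the Hessian test in 2(d) can likewise be verified numerically. This sketch (an added check, not part of the original solution) confirms that the midpoint of the two feasible points violates the constraint x_1 x_2 ≤ 9:

```python
import numpy as np

# Problem 2(d): Hessian of f = 4 x1^2 + 3/x2 - 8 x1 + 3 x2 at (1, 1).
# d^2 f / dx2^2 = 6 / x2^3 = 6 at x2 = 1.
H = np.array([[8.0, 0.0],
              [0.0, 6.0]])
eigs = np.linalg.eigvalsh(H)      # {6, 8}: positive definite -> local minimum

# Problem 3(a): two points on the boundary of x1*x2 <= 9 ...
x = np.array([5.0, 9.0 / 5.0])    # x1*x2 = 9, feasible
y = np.array([9.0 / 5.0, 5.0])    # x1*x2 = 9, feasible

# ... whose midpoint leaves the feasible set.
z = 0.5 * x + 0.5 * y             # (17/5, 17/5)
violation = z[0] * z[1]           # (17/5)^2 = 11.56 > 9
```

Since a convex set must contain the segment between any two of its points, a single violated midpoint is enough to show the feasible region is not convex.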