… ¯x for this NLP as described in the Karush-Kuhn-Tucker Theorem. Using these conditions and the theorem, prove that ¯x is optimal.

b) Suppose that we replace the objective function by min −x₁ + αx₂. Indicate for what values of α the point ¯x is optimal for (NLP).

Exercise 3 (15 marks).

a) Let g₁, g₂, g₃ : ℜ → ℜ be defined by g₁(x) := −x, g₂(x) := 2, g₃(x) := x. Plot these functions in ℜ². Identify on your plot the function ĝ defined by ĝ(x) := max{g₁(x), g₂(x), g₃(x)}. Prove that ĝ is a convex function.

b) Suppose g₁, g₂, …, gₘ : ℜⁿ → ℜ are given convex functions. Define the function ĝ : ℜⁿ → ℜ by ĝ(x) := max{g₁(x), g₂(x), …, gₘ(x)}. Prove that ĝ is a convex function.
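As an illustration (not part of the assignment), the pointwise maximum in Exercise 3 a) can be evaluated numerically. The sketch below assumes the three functions g₁(x) = −x, g₂(x) = 2, g₃(x) = x as defined above; the function and variable names are choices for this example only.

```python
def g1(x):
    return -x

def g2(x):
    return 2.0

def g3(x):
    return x

def g_hat(x):
    # Pointwise maximum of the three functions from Exercise 3 a).
    return max(g1(x), g2(x), g3(x))

# The plot asked for in the exercise is piecewise linear:
# g_hat agrees with -x for x <= -2, with the constant 2 on [-2, 2],
# and with x for x >= 2.
print(g_hat(-5.0))  # 5.0  (the -x branch dominates)
print(g_hat(0.0))   # 2.0  (the constant branch dominates)
print(g_hat(5.0))   # 5.0  (the x branch dominates)

# A midpoint check consistent with the convexity to be proved:
# g_hat((x + y)/2) <= (g_hat(x) + g_hat(y))/2.
assert g_hat(0.0) <= 0.5 * g_hat(-5.0) + 0.5 * g_hat(5.0)
```

Evaluating a few midpoints this way is only a sanity check, of course; the exercise asks for a proof of convexity, not numerical evidence.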
 Spring '10
 GUENIN
