Lecture 4

In applications, oftentimes constraints are linear. In that case GCQ is automatically satisfied, so you don't need to check it (exercise). It is known that the GCQ is the weakest possible condition [1].

5 Sufficient condition

The Karush-Kuhn-Tucker theorem provides necessary conditions for optimality: if the constraint qualification holds, then a local solution must satisfy the Karush-Kuhn-Tucker conditions (first-order conditions and complementary slackness conditions). [Margin note: a second-order sufficient condition is not needed.] Note that the KKT conditions are equivalent to

    ∇_x L(x̄, λ, µ) = 0,    (4)

where L(x, λ, µ) is the Lagrangian. (4) is the first-order necessary condition of the unconstrained minimization problem

    min_{x ∈ R^N} L(x, λ, µ).    (5)

Below I give a sufficient condition for optimality.

Proposition 6. Suppose that x̄ is a solution to the unconstrained minimization problem (5) for some λ ∈ R^I_+ and µ ∈ R^J. If g_i(x̄) ≤ 0 and λ_i g_i(x̄) = 0 for all i, and h_j(x̄) = 0 for all j, then x̄ is a solution to the constrained minimization problem (1).

Proof. Take any x such that g_i(x) ≤ 0 for all i and h_j(x) = 0 for all j. Then

    f(x̄) = f(x̄) + Σ_{i=1}^I λ_i g_i(x̄) + Σ_{j=1}^J µ_j h_j(x̄)
          = L(x̄, λ, µ) ≤ L(x, λ, µ)
          = f(x) + Σ_{i=1}^I λ_i g_i(x) + Σ_{j=1}^J µ_j h_j(x) ≤ f(x).

The first line is due to λ_i g_i(x̄) = 0 for all i and h_j(x̄) = 0 for all j. The second line is the assumption that x̄ minimizes L(·, λ, µ). The third line is due to λ_i ≥ 0 and g_i(x) ≤ 0 for all i and h_j(x) = 0 for all j.

6 Constrained maximization

Finally, we briefly discuss maximization. Although maximization is equivalent to minimization by flipping the sign of the objective function, doing so every time is awkward. So consider the maximization problem

    maximize   f(x)
    subject to g_i(x) ≥ 0   (i = 1, ..., I)
               h_j(x) = 0   (j = 1, ..., J).    (6)

(6) is equivalent to the minimization problem

    minimize   −f(x)
    subject to −g_i(x) ≤ 0   (i = 1, ..., I)
               −h_j(x) = 0   (j = 1, ..., J).    (7)

Assuming that x̄ is a local solution and the constraint qualification holds, the KKT conditions are

    −∇f(x̄) − Σ_{i=1}^I λ_i ∇g_i(x̄) − Σ_{j=1}^J µ_j ∇h_j(x̄) = 0,    (8a)
    (∀i) λ_i (−g_i(x̄)) = 0.    (8b)

But (8) is equivalent to (3). For this reason, it is customary to formulate a maximization problem as in (6) so that the inequality constraints are always "greater than or equal to zero". As an example, consider a consumer with utility function u(x) = α log x_1 + (1 − α) log x_2, where 0 …
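Proposition 6 lends itself to a quick numerical check. The sketch below is not from the notes: the toy problem (minimize f(x) = x_1² + x_2² subject to g(x) = 1 − x_1 − x_2 ≤ 0), the candidate multiplier λ = 1, and the use of scipy.optimize are all illustrative assumptions chosen for this example.

    # Illustrative sketch of Proposition 6 (toy problem, not from the lecture):
    # minimize f(x) = x1^2 + x2^2 subject to g(x) = 1 - x1 - x2 <= 0.
    import numpy as np
    from scipy.optimize import minimize

    def f(x):
        return x[0]**2 + x[1]**2

    def g(x):
        return 1.0 - x[0] - x[1]            # inequality constraint, g(x) <= 0

    lam = 1.0                                # candidate multiplier (lambda >= 0), assumed

    def lagrangian(x):
        return f(x) + lam * g(x)             # L(x, lambda) = f(x) + lambda * g(x)

    # Step 1: minimize the Lagrangian over all of R^2 (problem (5) in the notes).
    x_bar = minimize(lagrangian, x0=np.zeros(2)).x

    # Step 2: check the hypotheses of Proposition 6 at x_bar.
    print("x_bar           =", x_bar)                 # approximately (0.5, 0.5)
    print("g(x_bar) <= 0   :", g(x_bar) <= 1e-6)      # feasibility
    print("lam * g(x_bar)  =", lam * g(x_bar))        # complementary slackness (≈ 0)

    # Step 3: compare with a direct constrained solve; the two should agree.
    res_con = minimize(f, x0=np.zeros(2),
                       constraints={"type": "ineq", "fun": lambda x: -g(x)})
    print("constrained min =", res_con.x)

With λ = 1 the unconstrained Lagrangian minimizer is x̄ = (0.5, 0.5); it is feasible and satisfies λ g(x̄) = 0, so Proposition 6 guarantees it solves the constrained problem, which the direct constrained solve confirms.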