Chapter 5: Constrained Optimization

5.5 Gradient Projection and Reduced Gradient Methods

Rosen's gradient projection method is based on projecting the search direction into the subspace tangent to the active constraints. Let us first examine the method for the case of linear constraints [7]. We define the constrained problem as

    minimize   f(x)
    such that  g_j(x) = ∑_{i=1}^{n} a_{ji} x_i − b_j ≥ 0,   j = 1, …, n_g.   (5.5.1)

In vector form

    g_j = a_j^T x − b_j.   (5.5.2)

If we select only the r active constraints (j ∈ I_A), we may write the constraint equations as

    g_a = N^T x − b = 0,   (5.5.3)

where g_a is the vector of active constraints and the columns of the matrix N are the gradients of these constraints.

The basic assumption of the gradient projection method is that the move lies in the subspace tangent to the active constraints. If

    x_{i+1} = x_i + α s,   (5.5.4)

and both x_i and x_{i+1} satisfy Eq. (5.5.3), then

    N^T s = 0.   (5.5.5)

If we want the steepest descent direction satisfying Eq. (5.5.5), we can pose the problem as

    minimize   s^T ∇f
    such that  N^T s = 0,  and  s^T s = 1.   (5.5.6)

That is, we want to find the direction with the most negative directional derivative which satisfies Eq. (5.5.5). We use Lagrange multipliers λ and μ to form the Lagrangian

    L(s, λ, μ) = s^T ∇f − s^T N λ − μ(s^T s − 1).   (5.5.7)

The condition for L to be stationary is

    ∂L/∂s = ∇f − N λ − 2μ s = 0.   (5.5.8)

Premultiplying Eq. (5.5.8) by N^T and using Eq. (5.5.5) we obtain

    N^T ∇f − N^T N λ = 0,   (5.5.9)

or

    λ = (N^T N)^{−1} N^T ∇f.   (5.5.10)

So that from Eq. (5.5.8)

    s = (1/2μ) [I − N (N^T N)^{−1} N^T] ∇f = (1/2μ) P ∇f.   (5.5.11)

P is the projection matrix defined in Eq. (5.3.8). The factor of 1/2μ is not significant because s defines only the direction of search, so in general we use s = −P∇f.
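The projected steepest-descent direction of Eq. (5.5.11) is straightforward to compute numerically. The following sketch (not from the text; the function name and example data are illustrative) forms P = I − N(NᵀN)⁻¹Nᵀ and returns s = −P∇f:

```python
import numpy as np

def projected_steepest_descent(grad_f, N):
    """Search direction s = -P grad_f, with P = I - N (N^T N)^{-1} N^T (Eq. 5.5.11).

    grad_f : (n,) gradient of the objective at the current point
    N      : (n, r) matrix whose columns are the gradients of the r active constraints
    """
    n = N.shape[0]
    # Projection matrix onto the subspace tangent to the active constraints;
    # solve with N^T N rather than forming its explicit inverse.
    P = np.eye(n) - N @ np.linalg.solve(N.T @ N, N.T)
    return -P @ grad_f

# Illustrative example: n = 3, one active constraint g = x_1 + x_2 + x_3 - b = 0
N = np.array([[1.0], [1.0], [1.0]])
grad_f = np.array([1.0, 2.0, 3.0])
s = projected_steepest_descent(grad_f, N)
# The direction stays tangent to the active constraint: N^T s = 0 (Eq. 5.5.5)
assert np.allclose(N.T @ s, 0.0)
```

Here the projection simply removes the mean of the gradient components, since the constraint normal is the vector of ones.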
To show that P indeed has the projection property, we need to prove that if w is an arbitrary vector, then Pw is in the subspace tangent to the active constraints, that is, Pw satisfies

    N^T P w = 0.   (5.5.12)

We can easily verify this by using the definition of P.

Equation (5.3.8), which defines the projection matrix P, does not provide the most efficient way of calculating it. Instead, it can be shown that

    P = Q_2^T Q_2,   (5.5.13)

where the matrix Q_2 consists of the last n − r rows of the Q factor in the QR factorization of N (see Eq. (5.3.9)).

A version of the gradient projection method known as the generalized reduced gradient method was developed by Abadie and Carpentier [8]. As a first step we select r linearly independent rows of N, denote their transpose as N_1, and partition N^T as

    N^T = [N_1  N_2].   (5.5.14)

Next we consider Eq. (5.5.5) for the components s_i of the direction vector. The r equations corresponding to N_1 are then used to eliminate r components of s and obtain a reduced-order problem for the direction vector.
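Both the projection property of Eq. (5.5.12) and the QR identity of Eq. (5.5.13) can be checked numerically. The sketch below (illustrative, not from the text; the random test matrix is an assumption) compares the definition of P against the QR construction. Note a convention difference: NumPy's `qr` returns N = QR, so the tangent-subspace basis appears as the last n − r *columns* of Q, the transpose of the book's "last n − r rows" convention:

```python
import numpy as np

# Verify P = Q2^T Q2 (Eq. 5.5.13) against the definition
# P = I - N (N^T N)^{-1} N^T, and the property N^T P w = 0 (Eq. 5.5.12).
rng = np.random.default_rng(0)
n, r = 5, 2
N = rng.standard_normal((n, r))        # columns = active-constraint gradients

P_def = np.eye(n) - N @ np.linalg.solve(N.T @ N, N.T)

# NumPy's QR gives N = Q R; the last n - r columns of Q form an orthonormal
# basis for the subspace tangent to the active constraints.
Q, R = np.linalg.qr(N, mode="complete")
Q2 = Q[:, r:]                          # n x (n - r) tangent basis
P_qr = Q2 @ Q2.T

assert np.allclose(P_def, P_qr)
w = rng.standard_normal(n)
assert np.allclose(N.T @ (P_def @ w), 0.0)  # Pw is tangent: Eq. (5.5.12)
```

The QR route is preferred in practice: it avoids forming and inverting NᵀN, whose conditioning is the square of that of N.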
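The elimination step of the reduced gradient method can be sketched in a few lines. Assuming (as the partition of Eq. (5.5.14) requires) that N_1 is nonsingular, the tangency condition N^T s = 0 splits into N_1 s_1 + N_2 s_2 = 0, so the r dependent components s_1 follow from the n − r free components s_2 (the random data below is illustrative only):

```python
import numpy as np

# Reduced-gradient elimination: partition N^T = [N1 N2] (N1 is r x r and
# nonsingular) and s = [s1; s2]. Then N^T s = 0 (Eq. 5.5.5) gives
# N1 s1 + N2 s2 = 0, i.e. s1 = -N1^{-1} N2 s2, leaving n - r free unknowns.
rng = np.random.default_rng(1)
n, r = 5, 2
NT = rng.standard_normal((r, n))       # N^T: rows = active-constraint gradients
N1, N2 = NT[:, :r], NT[:, r:]          # assume the first r columns are independent

s2 = rng.standard_normal(n - r)        # free (independent) components
s1 = -np.linalg.solve(N1, N2 @ s2)     # eliminated (dependent) components
s = np.concatenate([s1, s2])

assert np.allclose(NT @ s, 0.0)        # s lies in the tangent subspace
```

This is the "reduced-order problem" the text refers to: any choice of s_2 yields a direction s satisfying all r active constraints, so the search proceeds in only n − r variables.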
This note was uploaded on 06/07/2011 for the course EGM 6365 taught by Professor Staff during the Spring '08 term at University of Florida.