# Fundamental Engineering Optimization Methods

Once a search direction $\mathbf{d}$ has been determined, a step size along $\mathbf{d}$ needs to be computed by solving the line search problem. We next discuss the descent function approach that is used to resolve the line search step in the SQP solution process.

## 7.6.1 Descent Function Approach

In SQP methods, the line search solution is based on minimization of a descent function that penalizes constraint violations. The following descent function has been proposed in the literature (Arora, p. 521):

$$\Phi(\mathbf{x}) = f(\mathbf{x}) + R\,V(\mathbf{x}) \tag{7.50}$$

where $f(\mathbf{x})$ represents the cost function value, $V(\mathbf{x})$ represents the maximum constraint violation, and $R > 0$ is a penalty parameter. The descent function value at the current iteration is expressed as:

$$\Phi_k = f_k + R\,V_k, \qquad R = \max\{R_k,\, r_k\} \tag{7.51}$$

where $R_k$ is the current value of the penalty parameter, $r_k$ is the current sum of the Lagrange multipliers, and $V_k$ is the maximum constraint violation in the current step. The latter parameters are computed as:

$$r_k = \sum_{i=1}^{m} u_i^k + \sum_{j=1}^{p} \bigl|v_j^k\bigr|, \qquad V_k = \max\bigl\{0;\ g_i,\ i = 1,\dots,m;\ |h_j|,\ j = 1,\dots,p\bigr\} \tag{7.52}$$

where absolute values of the Lagrange multipliers and of the constraint violations for the equality constraints are used. Next, the line search subproblem is defined as:

$$\min_\alpha\ \Phi(\alpha) = \Phi\bigl(\mathbf{x}^k + \alpha\,\mathbf{d}^k\bigr) \tag{7.53}$$

The above problem may be solved via the line search methods described in Sec. 7.2. An algorithm for solving the SQP problem is presented below.

**SQP Algorithm (Arora, p. 526):**

Initialize: choose $\mathbf{x}^0$, $R_0 = 1$, $\varepsilon_1 > 0$, $\varepsilon_2 > 0$.

For $k = 0, 1, 2, \dots$

1. Compute $f^k, g_i^k, h_j^k, \mathbf{c}, b_i, e_j$; compute $V_k$.
2. Formulate and solve the QP subproblem to obtain $\mathbf{d}^k$ and the Lagrange multipliers $\mathbf{u}^k$ and $\mathbf{v}^k$.
3. If $V_k \le \varepsilon_1$ and $\bigl\|\mathbf{d}^k\bigr\| \le \varepsilon_2$, stop.
4. Compute $R$; formulate and solve the line search subproblem to obtain $\alpha$.
5. Set $\mathbf{x}^{k+1} \leftarrow \mathbf{x}^k + \alpha\,\mathbf{d}^k$, $R_{k+1} \leftarrow R$, $k \leftarrow k + 1$.

It can be shown that the above algorithm is convergent, i.e., $\Phi(\mathbf{x}^k) \le \Phi(\mathbf{x}^0)$, and that $\mathbf{x}^k$ converges to the KKT point in the case of general constrained optimization problems (Arora, p. 525).
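The penalty bookkeeping in Eqs. (7.50)–(7.52) can be sketched in a few lines. The function below is a minimal illustration, not the book's code; the name `descent_function` and its argument layout are assumptions for this example.

```python
import numpy as np

def descent_function(f, g, h, u, v, R_prev):
    """Evaluate the SQP descent function Phi = f + R*V (Eqs. 7.50-7.52).

    f      : cost function value f(x^k)
    g, h   : inequality/equality constraint values g_i, h_j at x^k
    u, v   : QP-subproblem Lagrange multipliers (u_i >= 0)
    R_prev : penalty parameter R_k carried over from the previous iteration
    """
    # r_k: sum of multipliers, absolute values taken for equality multipliers
    r = float(np.sum(u) + np.sum(np.abs(v)))
    # V_k: maximum constraint violation, zero when all constraints hold
    V = max([0.0] + [float(gi) for gi in g] + [abs(float(hj)) for hj in h])
    # R = max{R_k, r_k} keeps the penalty large enough to enforce descent
    R = max(R_prev, r)
    return f + R * V, R, V
```

For instance, with $f = 2$, $g = (-1, 0.5)$, $h = (0.2)$, multipliers $u = (0.3, 0.1)$, $v = (-0.4)$ and $R_k = 1$, we get $r_k = 0.8$, $V_k = 0.5$, $R = 1$ and $\Phi_k = 2.5$.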
## 7.6.2 SQP with Approximate Line Search

The above SQP algorithm can be used with approximate line search methods, similar to Armijo's rule (Sec. 7.2.2), as follows: let $t_j,\ j = 0, 1, \dots$ denote a trial step size, $\mathbf{x}^{k+1,j}$ denote the trial design point, $f^{k+1,j} = f\bigl(\mathbf{x}^{k+1,j}\bigr)$ denote the function value at the trial solution, and $\Phi^{k+1,j} = f^{k+1,j} + R\,V^{k+1,j}$ denote the penalty function at the trial solution. The trial solution is required to satisfy the following […]
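The trial-step scheme above can be sketched as a backtracking loop on the penalty function $\Phi$. The exact acceptance condition is cut off in this excerpt, so the sufficient-decrease test below ($\Phi(\text{trial}) \le \Phi_k - \gamma\, t_j \|\mathbf{d}^k\|^2$) is an assumed Armijo-style condition, and the function name and defaults are illustrative only.

```python
import numpy as np

def approx_line_search(phi, x, d, Phi_k, gamma=1e-4, t0=1.0, max_trials=30):
    """Armijo-type backtracking on the penalty function Phi (a sketch).

    phi   : callable returning Phi(x) = f(x) + R*V(x) at a design point
    x, d  : current design point x^k and search direction d^k
    Phi_k : descent function value at x^k
    NOTE: the sufficient-decrease test is an assumed Armijo-style condition;
    the text's exact condition is truncated in this excerpt.
    """
    dd = float(np.dot(d, d))
    t = t0
    for j in range(max_trials):      # trial steps t_j = t0 * 0.5**j
        trial = x + t * d            # trial design point x^{k+1,j}
        if phi(trial) <= Phi_k - gamma * t * dd:
            return t, trial          # sufficient decrease achieved
        t *= 0.5                     # halve the step and retry
    return t, x + t * d              # fall back to the smallest trial step
```

As a quick check, minimizing $\phi(x) = x^2$ from $x = 3$ along $d = -3$ accepts the full step $t_0 = 1$ immediately, landing at $x = 0$.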
