OPTIMIZATION


Bracketing methods are closed approaches in the sense that the optimal solution is bracketed by lower and upper boundaries. Newton's method, in contrast, is an open (rather than bracketing) approach: the optimum of the one-dimensional function f(x) is found from an initial guess of the optimal value, without the need to specify lower and upper boundary values for the search region.

NEWTON'S METHOD

Newton's method is an open approach to finding the minimum or the maximum of a function f(x). It is very similar to the Newton-Raphson method for finding the roots of a function, i.e., the points where f(x) = 0. Since the derivative f'(x) = 0 at the function's maxima and minima, these extrema can be found by applying the Newton-Raphson method to the derivative, essentially obtaining the iteration

    x_(i+1) = x_i - f'(x_i) / f''(x_i)
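As a sketch of this idea, the following applies Newton-Raphson to f'(x) to locate a stationary point. The function, its derivatives, and the starting point are illustrative assumptions, not from the slides.

```python
def newtons_method(df, d2f, x0, tol=1e-8, max_iter=100):
    """Find a stationary point of f by applying Newton-Raphson to f'(x).

    df  : first derivative f'(x)
    d2f : second derivative f''(x)
    x0  : initial guess (a good guess is needed for convergence)
    """
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)   # Newton step: f'(x) / f''(x)
        x -= step
        if abs(step) < tol:     # stop once the update is negligible
            break
    return x

# Assumed example: f(x) = x**2 - 4*x + 5, so f'(x) = 2*x - 4, f''(x) = 2.
# The minimum is at x = 2; for a quadratic, one Newton step lands exactly on it.
x_min = newtons_method(lambda x: 2 * x - 4, lambda x: 2.0, x0=10.0)
```

Note that the same iteration converges to a maximum just as readily as to a minimum; the sign of f''(x) at the result tells you which one was found.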
NEWTON'S METHOD

We caution that, before using Newton's method to determine the minimum or the maximum of a function, one should have a reasonably good initial estimate of the solution to ensure convergence, and the function should be twice differentiable, with derivatives that are easy to evaluate.

GOLDEN SECTION SEARCH METHOD

The golden-section search is a technique for finding the extremum (minimum or maximum) of a strictly unimodal function by successively narrowing the range of values inside which the extremum is known to exist. The technique derives its name from the fact that the algorithm maintains the function values at triples of points whose distances form a golden ratio. The algorithm is the limit of Fibonacci search for a large number of function evaluations. Fibonacci search and golden-section search were discovered by Kiefer (1953) (see also Avriel and Wilde (1966)).
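A minimal sketch of the interval-narrowing idea, written here for minimization; the test function and bracket are assumptions for illustration. At each step the two interior points divide the bracket in the golden ratio, so one function value can be reused after the bracket shrinks.

```python
import math

def golden_section_min(f, a, b, tol=1e-8):
    """Narrow the bracket [a, b] around the minimum of a unimodal f.

    Interior points c < d are placed so each divides [a, b] in the
    golden ratio; the bracket shrinks by the factor 1/phi each step.
    """
    invphi = (math.sqrt(5) - 1) / 2   # 1/phi ~= 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            # Minimum lies in [a, d]: drop the right end.
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            # Minimum lies in [c, b]: drop the left end.
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# Assumed example: f(x) = (x - 1.5)**2 is unimodal on [0, 4], minimum at 1.5.
x_min = golden_section_min(lambda x: (x - 1.5) ** 2, 0.0, 4.0)
```

To find a maximum instead, apply the same routine to -f(x).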
MULTIDIMENSIONAL DIRECT SEARCH METHOD

Pattern search (also known as direct search, derivative-free search, or black-box search) is a family of numerical optimization methods that does not require a gradient. As a result, it can be used on functions that are not continuous or differentiable. Convergence of such pattern search methods has been established based on the theory of positive bases.
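One simple member of this family is compass (coordinate) pattern search, sketched below under assumed settings: poll along each coordinate direction, accept any improving point, and shrink the step size when no direction improves. Note it evaluates only f, never a gradient.

```python
def compass_search(f, x0, step=1.0, tol=1e-6, shrink=0.5):
    """Derivative-free compass (pattern) search for minimizing f.

    Polls the points x +/- step*e_i along each coordinate axis;
    when no poll point improves on x, the step length is shrunk.
    """
    x = list(x0)
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for sign in (+1, -1):
                trial = x[:]
                trial[i] += sign * step   # poll along coordinate i
                ft = f(trial)
                if ft < fx:               # accept any improvement
                    x, fx = trial, ft
                    improved = True
        if not improved:
            step *= shrink                # no direction helped: refine
    return x

# Assumed example: a smooth 2-D bowl with its minimum at (1, -2).
x_min = compass_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2, [0.0, 0.0])
```

Because only function values are compared, the same loop works unchanged on noisy or non-differentiable objectives, at the cost of many more evaluations than gradient-based methods.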
