optimization.docx


Note that the only new calculation is done to determine the new $x_2$:
$$x_l = x_l, \qquad x_u = x_1, \qquad x_1 = x_2, \qquad x_2 = x_u - \frac{\sqrt{5}-1}{2}\,(x_u - x_l) \tag{6}$$
Step 3: If $x_u - x_l < \varepsilon$ (a sufficiently small number), then the maximum occurs at $\dfrac{x_u + x_l}{2}$ and stop iterating; otherwise, go to Step 2.
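A minimal Python sketch of the procedure described in Steps 2 and 3 is given below. The test function, interval, and tolerance in the usage lines are illustrative assumptions, not part of the original notes.

```python
import math

def golden_section_max(f, xl, xu, eps=1e-6, max_iter=200):
    """Golden Section Search for the maximum of f on [xl, xu].

    Follows the update rules above: only one new interior point
    (and one new function evaluation) is computed per iteration.
    """
    g = (math.sqrt(5) - 1) / 2          # golden ratio factor (sqrt(5) - 1)/2
    x1 = xl + g * (xu - xl)             # upper interior point
    x2 = xu - g * (xu - xl)             # lower interior point
    f1, f2 = f(x1), f(x2)
    for _ in range(max_iter):
        if xu - xl < eps:               # Step 3: interval small enough
            break
        if f1 > f2:                     # maximum lies in [x2, xu]
            xl, x2, f2 = x2, x1, f1
            x1 = xl + g * (xu - xl)     # only new calculation: x1
            f1 = f(x1)
        else:                           # maximum lies in [xl, x1]
            xu, x1, f1 = x1, x2, f2
            x2 = xu - g * (xu - xl)     # only new calculation: x2, Eq. (6)
            f2 = f(x2)
    return (xu + xl) / 2                # Step 3: estimate of the maximizer

# Illustrative example: maximize f(x) = -(x - 2)^2 + 3 on [0, 5]
x_star = golden_section_max(lambda x: -(x - 2)**2 + 3, 0.0, 5.0)
print(x_star)   # approximately 2.0
```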


V. Multidimensional Direct Search Method

Methods for finding optimal solutions in multidimensional spaces are not too different from their cousins used for finding optimal solutions in a single dimension. The trade-off between general applicability and computational complexity also exists in multidimensional optimization. The multidimensional direct search methods we cover in this chapter, like the one-dimensional Golden Section Search method, do not require a differentiable function. These methods are sometimes referred to as zeroth-order algorithms because they do not require differentiating the optimization function.

Probably the most obvious way to solve an optimization problem in multidimensional space is to systematically evaluate every possible solution and select the maximum or the minimum, depending on our objective. This approach is very generally applicable and may even be useful if the solution space is relatively small. However, as the dimension of the problem space (the number of independent variables) increases, the computational complexity of this approach quickly becomes unmanageable.
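To make this concrete, here is a small, illustrative brute-force grid search in Python; the test function and grid resolution are assumptions for the example. With $m$ points per coordinate and $d$ independent variables, it performs $m^d$ function evaluations, which is why the cost explodes as $d$ grows.

```python
import itertools

def grid_search_max(f, bounds, points_per_dim=51):
    """Exhaustive search: evaluate f on a uniform grid over `bounds`
    (a list of (low, high) pairs, one per independent variable) and
    return the best grid point. Cost grows as points_per_dim ** len(bounds).
    """
    axes = []
    for lo, hi in bounds:
        step = (hi - lo) / (points_per_dim - 1)
        axes.append([lo + i * step for i in range(points_per_dim)])
    best_x, best_f = None, float("-inf")
    for x in itertools.product(*axes):      # m**d candidate points
        fx = f(x)
        if fx > best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Illustrative 2-D example: maximum of f(x, y) = -(x - 1)^2 - (y + 2)^2
f = lambda v: -(v[0] - 1)**2 - (v[1] + 2)**2
print(grid_search_max(f, [(-5, 5), (-5, 5)]))   # approximately ((1.0, -2.0), 0.0)
```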
Therefore, we are interested in methods that intelligently search through the solution space to find an optimal solution without enumerating all possible solutions. It is important to note that some of the popular optimization techniques you may have heard of, such as simulated annealing, tabu search, neural networks, and genetic algorithms, all fall under this family of optimization techniques.

The coordinate cycling search method starts from an initial point and looks for an optimal solution along each coordinate direction in turn. For example, for a function $f(x, y)$ with two independent variables $x$ and $y$, and starting at the point $(x_0, y_0)$, the first search moves along the direction $(1, 0)$ until an optimal solution is found for the function $f(x, y_0)$. The next search moves along the direction $(0, 1)$ to determine the optimal value of the function $f(x_1, y)$, where $x_1$ is the solution found in the previous search. Once searches in all directions are completed, the process is repeated in the next cycle. The search continues until convergence occurs or a predetermined error limit is met. The search along each coordinate direction can be conducted using any one of the one-dimensional search techniques previously covered. A visual representation of how the search converges is shown below in Figure 1.
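The sketch below is one possible (assumed) implementation of the coordinate cycling search, using a one-dimensional Golden Section Search along each coordinate direction as described above; the bounds, tolerance, and test function are illustrative.

```python
import math

def golden_max_1d(g, a, b, eps=1e-6):
    """One-dimensional Golden Section Search for the maximum of g on [a, b]."""
    r = (math.sqrt(5) - 1) / 2
    x1, x2 = a + r * (b - a), b - r * (b - a)
    g1, g2 = g(x1), g(x2)
    while b - a > eps:
        if g1 > g2:                      # maximum lies in [x2, b]
            a, x2, g2 = x2, x1, g1
            x1 = a + r * (b - a)
            g1 = g(x1)
        else:                            # maximum lies in [a, x1]
            b, x1, g1 = x1, x2, g2
            x2 = b - r * (b - a)
            g2 = g(x2)
    return (a + b) / 2

def coordinate_cycling_max(f, x0, bounds, cycles=20, tol=1e-6):
    """Cyclic coordinate search: optimize one coordinate at a time,
    holding the others fixed, and repeat until the point stops moving."""
    x = list(x0)
    for _ in range(cycles):
        x_old = list(x)
        for i, (lo, hi) in enumerate(bounds):        # search along direction e_i
            line = lambda t, i=i: f(x[:i] + [t] + x[i + 1:])
            x[i] = golden_max_1d(line, lo, hi)
        if max(abs(a - b) for a, b in zip(x, x_old)) < tol:
            break                                    # converged
    return x

# Illustrative example: maximize f(x, y) = -(x - 1)^2 - 2*(y + 3)^2
f = lambda v: -(v[0] - 1)**2 - 2 * (v[1] + 3)**2
print(coordinate_cycling_max(f, [0.0, 0.0], [(-10, 10), (-10, 10)]))
# approximately [1.0, -3.0]
```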


[Figure 1: Convergence of the coordinate cycling search, marking the initial search point, the points after the first, second, and third cycles, and the optimal point.]

VI. Multidimensional Gradient Method

The difference between gradient and direct search methods in multidimensional optimization is similar to the difference between these approaches in one-dimensional optimization. Direct search methods are useful when the derivative of the optimization function is not available to effectively guide the search for the optimum. While direct search methods explore the parameter space in a systematic manner, they are not computationally very efficient.
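Although the preview cuts this section off, the basic idea it introduces, letting the gradient guide the search, can be sketched as a simple steepest-ascent iteration; the central-difference gradient, fixed step size, and test function below are illustrative assumptions rather than the method as developed in the full chapter.

```python
def numerical_gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at point x."""
    grad = []
    for i in range(len(x)):
        xp = x[:i] + [x[i] + h] + x[i + 1:]
        xm = x[:i] + [x[i] - h] + x[i + 1:]
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad

def steepest_ascent(f, x0, step=0.1, tol=1e-6, max_iter=1000):
    """Move in the direction of the gradient until it (nearly) vanishes."""
    x = list(x0)
    for _ in range(max_iter):
        g = numerical_gradient(f, x)
        if max(abs(gi) for gi in g) < tol:
            break                                    # gradient ~ 0: stop
        x = [xi + step * gi for xi, gi in zip(x, g)]
    return x

# Illustrative example: maximize f(x, y) = -(x - 1)^2 - (y + 2)^2
f = lambda v: -(v[0] - 1)**2 - (v[1] + 2)**2
print(steepest_ascent(f, [0.0, 0.0]))   # approximately [1.0, -2.0]
```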
