Global optimization

Calculus-based methods are necessarily local methods: they do not guarantee to find the global minimum unless the function is in a restricted category (convex, for example). However, users typically want to find the global minimum. Doing this can be very costly, but worthwhile. Because of this, a large number of heuristic methods have been developed which converge, in some sense, to the global minimum. Very often these methods are described as search methods. Here I describe a number of these methods and how they relate to the local methods we will describe. Some of them have theoretical support; many have limited or no theoretical support.

Since derivatives only give local information, most global methods use only function values. There are a number of optimization methods that do not use derivatives, such as the well-known Nelder-Mead simplex method. These direct methods are not global optimization methods. They are, in fact, local methods that do not use derivatives.
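To make this concrete, here is a minimal sketch (assuming NumPy and SciPy are available; the Rastrigin test function and starting point are my own illustrative choices, not from these notes) showing that a derivative-free direct method is still a local method: started far from the global minimum of a multimodal function, Nelder-Mead converges to a nearby local minimum.

```python
import numpy as np
from scipy.optimize import minimize

def rastrigin(x):
    # Classic multimodal test function; global minimum is f(0, 0) = 0.
    x = np.asarray(x)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

# Start far from the origin: Nelder-Mead finds a nearby local minimum,
# not the global one.
res = minimize(rastrigin, x0=[3.2, -2.8], method="Nelder-Mead")
print(res.x, res.fun)   # typically near (3, -3) with f close to 18, not (0, 0)
```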
There has been a divergence between the community that develops global optimization algorithms such as simulated annealing and genetic algorithms, and the community that develops more traditional calculus-based methods. Very often the global optimization community has been more concerned with creating algorithms (heuristics) without much concern for rigorous analysis, and the various heuristics are compared against each other. When methods such as simulated annealing and genetic algorithms are compared with more traditional algorithms, the traditional algorithms tend to win out in speed and accuracy, especially when simple random start techniques are used.
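"Random start" here means something like the following sketch (the local method, search box, and number of starts are illustrative assumptions, not from these notes): run a standard local minimizer from many random initial points and keep the best local minimum found.

```python
import numpy as np
from scipy.optimize import minimize

def rastrigin(x):
    x = np.asarray(x)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

rng = np.random.default_rng(0)
best = None
for _ in range(100):
    x0 = rng.uniform(-5.12, 5.12, size=2)           # random starting point
    res = minimize(rastrigin, x0, method="BFGS")    # traditional local method
    if best is None or res.fun < best.fun:
        best = res
print(best.x, best.fun)   # with enough starts, usually near the global minimum
```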

Exhaustive search

The idea is simple and expensive. Very often the number of possible significantly different values of the input to an objective function is extremely large, beyond the number that can be tested in reasonable time. Nevertheless, exhaustive search is a good reference point: any algorithm that takes significantly longer than exhaustive search should not be used.
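A minimal sketch of exhaustive search over a grid (the domain and grid resolution are illustrative choices). Note the cost: with n points per axis, a d-dimensional problem needs n^d function evaluations, which is why this is usually only a reference point.

```python
import numpy as np

def rastrigin(x):
    x = np.asarray(x)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

grid = np.linspace(-5.12, 5.12, 201)   # 201 points per axis -> 201**2 evaluations
best_x, best_f = None, np.inf
for x1 in grid:
    for x2 in grid:
        f = rastrigin([x1, x2])
        if f < best_f:
            best_x, best_f = (x1, x2), f
print(best_x, best_f)   # this grid contains the origin, so it finds (0, 0)
```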

Simulated annealing

The idea of simulated annealing is to do local search, except that we allow increases in the objective function value with a certain probability. This makes it an example of a random search or stochastic search algorithm. The name simulated annealing comes from an analogy with thermodynamics: when a material is in a high-energy disordered state and is cooled quickly, it will likely end up in a (slightly lower but still) high-energy state.
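A hedged sketch of this accept/reject rule (the Metropolis-style acceptance probability exp(-Δf/T), the cooling schedule, and the step size are standard choices I am assuming; they are not specified at this point in the notes):

```python
import numpy as np

def rastrigin(x):
    x = np.asarray(x)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

rng = np.random.default_rng(0)
x = rng.uniform(-5.12, 5.12, size=2)        # random initial point
fx = rastrigin(x)
T = 10.0                                    # initial "temperature" (assumed)
for _ in range(20000):
    y = x + rng.normal(scale=0.5, size=2)   # random local move
    fy = rastrigin(y)
    delta = fy - fx
    # Always accept a decrease; accept an increase with prob. exp(-delta/T).
    if delta < 0 or rng.random() < np.exp(-delta / T):
        x, fx = y, fy
    T *= 0.9995                             # geometric cooling schedule
print(x, fx)   # often near the global minimum, but with no guarantee
```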
