04_LocalSearch

Scaling Up

So far, we have considered methods that systematically explore the full search space, possibly using principled pruning (A*, etc.). The current best such algorithms (RBFS / SMA*) can handle search spaces of up to about 10^100 states, i.e., roughly 500 binary-valued variables. But the search spaces for some real-world problems are much bigger, e.g. 10^30,000 states. Here, a completely different kind of search is needed: local search methods.

Local Search: Optimization Problems

We are interested in the goal state itself, not in how we get there.

Optimization problem:
- State: a vector of variables
- Objective function: f maps each state to a real number
- Goal: find a state that maximizes (or minimizes) the objective function

Examples: VLSI layout, job scheduling, map coloring, N-Queens. (What are good state representations for the 8-queens problem? What is a good heuristic?)

Local Search Methods

Applicable to optimization problems. Basic idea:
- use a single current state
- don't save the paths followed
- generally move only to successors/neighbors of the current state

Local search methods generally require a complete state description.

Hill-Climbing Search

  function HILL-CLIMBING(problem) returns a solution state
    inputs: problem, a problem
    static: current, a node
    current ← MAKE-NODE(INITIAL-STATE[problem])
    loop do
      next ← a highest-valued successor of current
      if VALUE[next] ≤ VALUE[current] then return current
      current ← next
    end

Evaluation: Hill-Climbing Pathologies

[Figure: an objective-function landscape over the state space, marking the global maximum, a local maximum, a "flat" local maximum (plateau), a shoulder, and the current state.]

[Figure: a local maximum example, an 8-queens state from which no single move improves.]

Neutral "Sideways" Moves

Take the new state even if it is not strictly better, just equal. This allows exploring plateaus, but can get into cycles. On the 8-queens problem:
- Without sideways moves: gets stuck 86% of the time; takes on average 4 steps per success, 3 per failure.
- With sideways moves: succeeds 94% of the time; takes on average 21 steps per success, 64 per failure.

Random Restarts

Simply restart at a new random state after a predefined number of steps. Is it worth it? If the probability of success on a single run is p, then the expected number of trials until success is 1/p. For example, with p = 0.14 for 8-queens without sideways moves, about 7 runs are needed on average.

Improvements to Basic Local Search

Issue: how to move more quickly to successively higher plateaus and avoid getting "stuck" at local maxima.
Idea: introduce downhill moves ("noise") to escape from long plateaus (or true local maxima).
Strategies:
- Random-restart hill climbing: multiple runs from randomly generated initial states
- Tabu search
- Simulated annealing
- Genetic algorithms
- Local beam search

Local Beam Search

Run the random starting points in parallel, always keeping the k most promising states:

  current ← k initial states
  for t ← 1 to infinity do
    new ← expand every state in current
    if f(best-in-new) < f(best-in-current) then return best-in-current
    current ← best k states in new

Ridges

[Figure: a ridge, a sequence of local maxima that is difficult for greedy local search to navigate.]

Simulated Annealing

Idea: use conventional hill-climbing techniques, but occasionally take a step in a direction other than that in which the rate of change is maximal. As time passes, the probability that a downhill step is taken is gradually reduced, and the size of any downhill step taken is decreased. (Kirkpatrick et al. 1983; Metropolis et al. 1953.)

Simulated Annealing Algorithm

  current ← initial state
  for t ← 1 to infinity do
    T ← schedule[t]
    if T = 0 then return current
    next ← a randomly selected successor of current
    ΔE ← f(next) - f(current)
    if ΔE > 0 then current ← next
    else current ← next only with probability e^(ΔE/T)
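To tie this section together, here is a minimal, runnable Python sketch of hill climbing with a bounded number of sideways moves and of the annealing loop above, applied to n-queens. The state representation (one queen row per column), the conflicts objective (which we minimize, so f = -conflicts), and the linear cooling schedule are illustrative assumptions, not part of the slides.

```python
import math
import random

def conflicts(state):
    """Number of attacking queen pairs; 0 means a solution (we minimize)."""
    n = len(state)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if state[i] == state[j] or abs(state[i] - state[j]) == j - i)

def best_successor(state):
    """Best single-queen move (fewest conflicts = highest value)."""
    n = len(state)
    best, best_val = None, None
    for col in range(n):
        for row in range(n):
            if row == state[col]:
                continue
            succ = state[:col] + [row] + state[col + 1:]
            v = conflicts(succ)
            if best_val is None or v < best_val:
                best, best_val = succ, v
    return best, best_val

def hill_climbing(n=8, max_sideways=100):
    """Steepest-ascent hill climbing, allowing up to max_sideways equal moves."""
    current = [random.randrange(n) for _ in range(n)]
    sideways = 0
    while True:
        nxt, v = best_successor(current)
        if v > conflicts(current):        # every move is strictly worse: stuck
            return current
        if v == conflicts(current):       # plateau: take a sideways move
            sideways += 1
            if sideways > max_sideways:
                return current
        else:
            sideways = 0
        current = nxt
        if conflicts(current) == 0:
            return current

def simulated_annealing(n=8, schedule=lambda t: max(0.0, 1.0 - 0.0001 * t)):
    """The annealing loop above: accept downhill moves with prob e^(dE/T)."""
    current = [random.randrange(n) for _ in range(n)]
    t = 0
    while True:
        t += 1
        T = schedule(t)
        if T == 0:
            return current
        col = random.randrange(n)
        succ = current[:col] + [random.randrange(n)] + current[col + 1:]
        dE = conflicts(current) - conflicts(succ)   # dE > 0 means succ is better
        if dE > 0 or random.random() < math.exp(dE / T):
            current = succ
```

Calling hill_climbing() repeatedly from fresh random states until conflicts(result) == 0 is exactly random-restart hill climbing as described above.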
Genetic Algorithms

The approach mimics evolution, and is usually presented using a rich (and different) vocabulary: fitness, populations, individuals, genes, crossover, mutations, etc. Still, it can be viewed quite directly in terms of standard local search.

Inspired by biological processes that produce genetic change in populations of individuals, genetic algorithms (GAs) are local search procedures that usually include the following basic elements:
- A Darwinian notion of fitness: the most fit individuals have the best chance of survival and reproduction.
- "Crossover" operators: parents are selected and pass their genetic material on to children.
- Mutation: individuals are subject to random changes in their genetic material.

Genetic Algorithms: Recombination

[Figure: recombination illustrated on the objective-function landscape (global maximum, local maximum, "flat" local maximum, shoulder).]

Features of Evolution
- High degree of parallelism (many individuals in a population).
- New individuals ("next / neighboring states") are derived by combining "parents" (the crossover operation).
- Random changes also happen (mutations).
- Selection of the next generation is based on survival of the fittest: the most fit parents tend to be used to generate new individuals.

General Idea

Maintain a population of individuals (states / strings / candidate solutions). Each individual is evaluated using a fitness function, i.e. an objective function; the fitness scores force individuals to compete for the privilege of survival and reproduction. Generate a sequence of generations: from the current generation, select pairs of individuals (based on fitness) to generate new individuals using crossover, and introduce some noise through random mutations. The hope is that average and maximum fitness (i.e. the value to be optimized) increase over time.

Genetic Algorithms as Search

Genetic algorithms are local heuristic search algorithms, especially good for problems that have large and poorly understood search spaces. They use a randomized parallel beam search to explore the state space. You must be able to define a good fitness function and, of course, a good state representation.

[Figure: a GA on 8-queens digit strings.
 (a) Initial population: 24748552, 32752411, 24415124, 32543213
 (b) Fitness function: 24 (31%), 23 (29%), 20 (26%), 11 (14%)
 (c) Selection: 32752411, 24748552, 32752411, 24415124
 (d) Crossover: 32748552, 24752411, 32752124, 24415411
 (e) Mutation: 32748152, 24752411, 32252124, 24415417]

GA: High-Level Algorithm

  GA(Fitness, Fitness_threshold, p, r, m):
    P ← randomly generate p individuals
    For each i in P, compute Fitness(i)
    While max_i Fitness(i) < Fitness_threshold:
      1. Probabilistically select (1-r)·p members of P to add to P_S.
      2. Probabilistically choose (r·p)/2 pairs of individuals <i1, i2> from P.
         For each pair, apply crossover and add the offspring to P_S.
      3. Mutate m·p random members of P_S.
      4. P ← P_S
      5. For each i in P, compute Fitness(i)
    Return the individual in P with the highest fitness.
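A minimal Python sketch of the GA loop above for bit-string individuals. The parameter names follow the pseudocode (population size p, crossover fraction r, mutation rate m); the fitness-proportionate selection and single-point crossover helpers anticipate the operators defined in the next sections, and the one-max fitness function at the bottom is a placeholder assumption.

```python
import random

def select(population, fitness, k):
    """Fitness-proportionate selection of k individuals (with replacement)."""
    # Small floor avoids all-zero weights; assumes non-negative fitness.
    weights = [max(fitness(ind), 1e-12) for ind in population]
    return random.choices(population, weights=weights, k=k)

def single_point_crossover(a, b):
    """Swap suffixes after a random cut point."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def mutate(ind):
    """Flip one randomly chosen bit."""
    i = random.randrange(len(ind))
    return ind[:i] + [1 - ind[i]] + ind[i + 1:]

def ga(fitness, threshold, p=100, r=0.6, m=0.05, length=20):
    population = [[random.randint(0, 1) for _ in range(length)] for _ in range(p)]
    while max(fitness(ind) for ind in population) < threshold:
        # 1. Survivors: (1 - r) * p members chosen by fitness.
        next_gen = select(population, fitness, int((1 - r) * p))
        # 2. (r * p) / 2 pairs produce offspring via crossover.
        for _ in range(int(r * p / 2)):
            a, b = select(population, fitness, 2)
            c1, c2 = single_point_crossover(a, b)
            next_gen += [c1, c2]
        # 3. Mutate m * p random members of the new population.
        for _ in range(int(m * p)):
            i = random.randrange(len(next_gen))
            next_gen[i] = mutate(next_gen[i])
        population = next_gen                 # 4. P <- P_S
    return max(population, key=fitness)

# Example: evolve the all-ones string ("one-max", a placeholder fitness).
best = ga(fitness=sum, threshold=20)
```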
Selecting the Most Fit Individuals

Individuals are chosen probabilistically for survival and crossover based on fitness-proportionate selection:

  Pr(i) = Fitness(i) / Σ_{j=1..p} Fitness(i_j)

Other selection methods include:
- Tournament selection: 2 individuals are selected at random. With probability p, the more fit of the two is selected; with probability (1-p), the less fit is selected.
- Rank selection: the individuals are sorted by fitness, and the probability of selecting an individual is proportional to its rank in the list.

Binary String Representations

Individuals are usually represented as a string over a finite alphabet, usually bit strings. The individuals represented can be arbitrarily complex: e.g., each component of the state description is allocated a specific portion of the string, which encodes the values that are acceptable. The bit-string representation allows the crossover operation to change multiple values in the state description; crossover and mutation can also produce previously unseen values.

8-Queens State Representation
- Option 1 (one digit per column): 8 6 4 2 7 5 3 1
- Option 2 (3 bits per column): 111 101 011 001 110 100 010 000

Mutation

Mutation randomly toggles one bit:

  Individual A:  1 0 0 1 0 1 1 1 0 1
  Individual A': 1 0 0 0 0 1 1 1 0 1

The mutation operator introduces random variations, allowing solutions to jump to different parts of the search space. What happens if the mutation rate is too low? What happens if it is too high? A common strategy is to use a high mutation rate when search begins, and to decrease the mutation rate as the search progresses.

Crossover Example

World championship chocolate chip cookie recipe (generation 1):

  Individual:  1    2    3    4    5    6    7
  Flour:       ...
  Sugar:       1    3    1    2.5  2.5  1.5  1.5
  Salt:        2    1    1    2.5  1.5  2    1.5
  Chips:       16   14   8    16   10   8    8
  Vanilla:     1    2    1    2    1    2    2
  Fitness:     4    4.5  2    2.2  4.1  8    3

Crossover Operators

Single-point crossover (here the cut is after position 6; a runnable sketch of both operators follows the summary at the end of these notes):

  Parent A:  1 0 0 1 0 1 | 1 1 0 1
  Parent B:  0 1 0 1 1 1 | 0 1 1 0
  Child AB:  1 0 0 1 0 1   0 1 1 0
  Child BA:  0 1 0 1 1 1   1 1 0 1

Uniform crossover (each position is taken independently from either parent):

  Parent A:  1 0 0 1 0 1 1 1 0 1
  Parent B:  0 1 0 1 1 1 0 1 1 0
  Child AB:  1 1 0 1 1 1 1 1 0 1
  Child BA:  0 0 0 1 0 1 0 1 1 0

Remarks on GAs
- In practice, populations contain several hundred to several thousand strings.
- Crowding can occur when an individual that is much more fit than the others reproduces like crazy, which reduces diversity in the population.
- In general, GAs are highly sensitive to the representation.
- The value of crossover over plain local search has so far been difficult to determine.

Genetic Programming

In genetic programming, programs are evolved instead of bit strings. Programs are represented by trees: for example, the expression sin(x) + √(x² + y) is a tree with + at the root, a sin subtree over x, and a square-root subtree over x² + y. (A tiny evaluator sketch also follows at the end of these notes.)

Local Search Summary
- A surprisingly efficient search method with a wide range of applications: any type of optimization / search task.
- Handles search spaces that are too large for systematic search (e.g., 10^1000 states).
- Often the best available algorithm when global information is lacking.
- Formal properties remain largely elusive.
- The research area will most likely continue to thrive.
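As a concrete companion to the crossover examples above, here is a short sketch that reproduces the slide's bit strings. The fixed cut point and the particular mask are chosen to match those examples; everything else is standard.

```python
def single_point(a, b, cut):
    """Single-point crossover: swap suffixes after position `cut`."""
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def uniform(a, b, mask):
    """Uniform crossover: mask[i] == 1 means child AB takes position i from B."""
    ab = [y if m else x for x, y, m in zip(a, b, mask)]
    ba = [x if m else y for x, y, m in zip(a, b, mask)]
    return ab, ba

A = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
B = [0, 1, 0, 1, 1, 1, 0, 1, 1, 0]

# Matches the single-point Child AB / Child BA shown above (cut after 6).
assert single_point(A, B, 6) == ([1, 0, 0, 1, 0, 1, 0, 1, 1, 0],
                                 [0, 1, 0, 1, 1, 1, 1, 1, 0, 1])

# One mask consistent with the uniform-crossover children shown above.
mask = [0, 1, 0, 0, 1, 0, 0, 0, 0, 0]
assert uniform(A, B, mask) == ([1, 1, 0, 1, 1, 1, 1, 1, 0, 1],
                               [0, 0, 0, 1, 0, 1, 0, 1, 1, 0])
```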
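Similarly, a tiny sketch of the tree representation used in genetic programming, evaluating the example expression from the Genetic Programming section; the tuple encoding and operator table are illustrative assumptions.

```python
import math

# A program is a tree: internal nodes are operators, leaves are variables/constants.
def make(op, *children):
    return (op, children)

OPS = {"+": lambda a, b: a + b,
       "^": lambda a, b: a ** b,
       "sin": math.sin,
       "sqrt": math.sqrt}

def evaluate(node, env):
    """Recursively evaluate an expression tree in a variable environment."""
    if isinstance(node, str):            # variable leaf
        return env[node]
    if isinstance(node, (int, float)):   # constant leaf
        return node
    op, children = node
    return OPS[op](*(evaluate(c, env) for c in children))

# sin(x) + sqrt(x^2 + y)
tree = make("+",
            make("sin", "x"),
            make("sqrt", make("+", make("^", "x", 2), "y")))

print(evaluate(tree, {"x": 2.0, "y": 5.0}))   # sin(2) + sqrt(9) = about 3.909
```

In GP, crossover then swaps randomly chosen subtrees between two parent trees, and mutation replaces a subtree with a randomly generated one.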