04-local_search

Foundations of Artificial Intelligence: Local Search
CS472 – Fall 2007
Filip Radlinski

Scaling Up

So far, we have considered methods that systematically explore the full search space, possibly using principled pruning (A*, etc.). The current best such algorithms (RBFS / SMA*) can handle search spaces of up to about 10^100 states, on the order of 500 binary-valued variables. But search spaces for some real-world problems can be much bigger, e.g. 10^30,000 states. Here, a completely different kind of search is needed.

Local Search Methods: Example

[Figure omitted: example illustration from the slides.]

Optimization Problems

We're interested in the goal state, not in how to get there.

An optimization problem consists of:
- State: a vector of variables
- Objective function: f, mapping each state to a value
- Goal: find a state that maximizes or minimizes the objective function

Examples: VLSI layout, job scheduling, map coloring, N-Queens.

Local Search Methods

Applicable to optimization problems. Basic idea:
- use a single current state
- don't save the paths followed
- generally move only to successors/neighbors of that state

Local search methods generally require a complete state description.
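As a concrete instance of this formulation, here is a minimal Python sketch of N-Queens as an optimization problem (my own illustration, not code from the slides; the names `conflicts` and `random_state` are hypothetical). The state is a complete description, one queen per column, and the objective to minimize is the number of attacking pairs.

```python
import random

def conflicts(state):
    """Objective to minimize: number of attacking queen pairs.

    state[i] is the row of the queen in column i (a complete state
    description: every column always holds exactly one queen).
    """
    n = len(state)
    return sum(
        1
        for i in range(n)
        for j in range(i + 1, n)
        # Same row, or same diagonal (row gap equals column gap).
        if state[i] == state[j] or abs(state[i] - state[j]) == j - i
    )

def random_state(n):
    """A random complete assignment: one queen per column."""
    return [random.randrange(n) for _ in range(n)]

# A solved 4-queens board has zero conflicts:
print(conflicts([1, 3, 0, 2]))  # prints 0
```

A goal state is then any state with `conflicts(state) == 0`; local search only ever looks at the current assignment and its neighbors, never at a path.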

Hill-Climbing Search

function HILL-CLIMBING(problem) returns a solution state
  inputs: problem, a problem
  static: current, a node
  current ← MAKE-NODE(INITIAL-STATE[problem])
  loop do
    next ← a highest-valued successor of current
    if VALUE[next] < VALUE[current] then return current
    current ← next
  end

Hill-Climbing Pathologies

[Figure omitted: the objective function plotted over the state space (evaluation of the current state on the vertical axis), marked with a shoulder, a "flat" local maximum, a local maximum, and the global maximum.]

Improvements to Basic Local Search

Issue: how to move more quickly to successively higher plateaus and avoid getting stuck at local maxima.

Idea: introduce downhill moves ("noise") to escape from long plateaus (or true local maxima).

Strategies:
- Random-restart hill-climbing: multiple runs from randomly generated initial states
- Tabu search
- Simulated annealing
- Genetic algorithms

Variations on Hill-Climbing

Random restarts: simply restart at a new random state after a pre-defined number of steps.
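The pseudocode above can be sketched in Python roughly as follows, using N-Queens as the landscape (my own illustration, not code from the slides). One deliberate difference: the slide's strict `<` test keeps moving across equal-valued successors, which can loop forever on a flat plateau, so this sketch stops on `<=` instead.

```python
def value(state):
    """Value to maximize: negative count of attacking queen pairs
    (0 means solved). state[i] is the row of the queen in column i."""
    n = len(state)
    return -sum(
        1
        for i in range(n)
        for j in range(i + 1, n)
        if state[i] == state[j] or abs(state[i] - state[j]) == j - i
    )

def successors(state):
    """Neighbors: move one queen to any other row in its column."""
    for col in range(len(state)):
        for row in range(len(state)):
            if row != state[col]:
                nxt = list(state)
                nxt[col] = row
                yield nxt

def hill_climbing(state):
    """Greedy ascent: repeatedly jump to a highest-valued successor,
    returning the current state once no successor improves on it."""
    current = state
    while True:
        nxt = max(successors(current), key=value)
        if value(nxt) <= value(current):
            return current
        current = nxt

result = hill_climbing([0, 1, 2, 3])
```

Starting from the all-diagonal board, this strictly improves the objective at every step, but, as the pathologies above show, it may still return a local maximum rather than a conflict-free board.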
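The random-restart variation described above can be sketched as follows (a hypothetical, self-contained illustration; `hill_climb` here does greedy descent on an N-Queens conflict count, and the restart loop keeps drawing fresh random initial states until a goal state is found or a restart budget runs out).

```python
import random

def conflicts(s):
    """Attacking queen pairs; 0 means solved (column i holds row s[i])."""
    return sum(s[i] == s[j] or abs(s[i] - s[j]) == j - i
               for i in range(len(s)) for j in range(i + 1, len(s)))

def hill_climb(s):
    """Greedy descent on conflicts until no single-queen move improves."""
    while True:
        best = min((s[:c] + [r] + s[c + 1:]
                    for c in range(len(s)) for r in range(len(s)) if r != s[c]),
                   key=conflicts)
        if conflicts(best) >= conflicts(s):
            return s
        s = best

def random_restart(n, max_restarts=100):
    """Rerun hill climbing from random initial states; return the first
    conflict-free board found, else the last local minimum reached."""
    for _ in range(max_restarts):
        s = hill_climb([random.randrange(n) for _ in range(n)])
        if conflicts(s) == 0:
            return s
    return s
```

Because each restart is independent, even a hill climber with a modest per-run success rate succeeds with high probability given enough restarts, which is what makes this such a common default strategy.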

This note was uploaded on 02/19/2008 for the course CS 4700 taught by Professor Joachims during the Fall '07 term at Cornell.
