Foundations of Artificial Intelligence
Local Search
CS472 – Fall 2007
Filip Radlinski

Scaling Up
• So far, we have considered methods that systematically explore the full search space, possibly using principled pruning (A*, etc.).
• The current best such algorithms (RBFS / SMA*) can handle search spaces of up to 10^100 states → roughly 500 binary-valued variables.
• But search spaces for some real-world problems can be much bigger, e.g. 10^30,000 states.
• Here, a completely different kind of search is needed. → Local Search Methods

Example
[figure omitted]

Optimization Problems
• We're interested in the Goal State – not in how we get there.
• Optimization Problem:
  - State: vector of variables
  - Objective Function: f: state → ℝ
  - Goal: find a state that maximizes or minimizes the objective function
• Examples: VLSI layout, job scheduling, map coloring, N-Queens.

Example
[figure omitted]

Local Search Methods
• Applicable to optimization problems.
• Basic idea:
  - use a single current state
  - don't save the paths followed
  - generally move only to successors/neighbors of the current state
• Generally require a complete state description.
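The basic idea above (single current state, no saved paths, move only to neighbors) can be sketched as steepest-ascent hill climbing on N-Queens, one of the example problems mentioned. This is an illustrative sketch, not code from the lecture; the state encoding (one queen per column) and function names are assumptions.

```python
import random

def conflicts(state):
    """Objective to minimize: number of pairs of queens that attack
    each other. state[col] = row of the queen in column col."""
    n = len(state)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            same_row = state[i] == state[j]
            same_diagonal = abs(state[i] - state[j]) == j - i
            if same_row or same_diagonal:
                count += 1
    return count

def hill_climb(n=8, seed=0):
    """Keep a single current state; repeatedly move to the best
    neighbor (one queen moved within its column) until no neighbor
    improves the objective. No paths are saved. May stop at a local
    minimum rather than a global one."""
    rng = random.Random(seed)
    state = [rng.randrange(n) for _ in range(n)]
    while True:
        current_cost = conflicts(state)
        best, best_cost = state, current_cost
        for col in range(n):
            for row in range(n):
                if row == state[col]:
                    continue
                neighbor = state[:col] + [row] + state[col + 1:]
                cost = conflicts(neighbor)
                if cost < best_cost:
                    best, best_cost = neighbor, cost
        if best_cost == current_cost:
            return state, current_cost  # local minimum reached
        state = best
```

Note that the search keeps only the current state and its objective value, which is why memory use is constant no matter how large the state space is; the trade-off is that it can get stuck in local minima.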