notation α ⊢ β to say that β can be derived from α by inference. There is
an alternative notation, with α written above a horizontal line and β below it, which emphasizes that this is not a sentence,
but rather an inference rule. Whenever something in the knowledge base
matches the pattern above the line,
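The rule-matching idea can be sketched in a few lines. This is a minimal illustration of one inference rule, Modus Ponens, over a knowledge base of sentence strings; the string encoding and names are my own, not the text's:

```python
def modus_ponens(kb):
    """Return sentences derivable from kb in one Modus Ponens step.

    Sentences are plain strings; an implication is written "P => Q".
    Whenever both "P => Q" and "P" are in the kb (the patterns above
    the line), "Q" (the pattern below the line) may be inferred.
    """
    derived = set()
    for sentence in kb:
        if "=>" in sentence:
            premise, conclusion = [s.strip() for s in sentence.split("=>")]
            if premise in kb and conclusion not in kb:
                derived.add(conclusion)
    return derived

kb = {"Breeze", "Breeze => PitNearby"}
print(modus_ponens(kb))  # {'PitNearby'}
```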
may of course be many different "real worlds" that have the same truth
values for those symbols. The only requirement to complete the
reconciliation is that each proposition symbol be either true or false in
each world. This is, of course, the basic ontol
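Since each proposition symbol is simply true or false in each world, the candidate worlds can be enumerated directly; a small sketch (the symbol names are invented for illustration):

```python
from itertools import product

def all_models(symbols):
    """Every assignment of true/false to the proposition symbols is one model."""
    return [dict(zip(symbols, values))
            for values in product([True, False], repeat=len(symbols))]

# Two symbols give 2^2 = 4 distinct models.
models = all_models(["P11", "B21"])
print(len(models))  # 4
```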
to leave the cave; it is effective only when the agent is in the start square.
The agent dies a miserable death if it enters a square containing a pit or
a live wumpus. It is safe (but smelly) to enter a square with a dead
wumpus. The agent's goal is to
reasoning like that in Figure 6.6. We want to generate new sentences
that are necessarily true, given that the old sentences are true. This
relation between sentences is called entailment, and mirrors the relation
of one fact following from another (Fi
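Entailment can be tested mechanically by checking every model: the knowledge base entails a sentence just when the sentence is true in every model in which the knowledge base is true. A sketch, with sentences encoded as Boolean functions of a model (an encoding chosen here for illustration):

```python
from itertools import product

def entails(kb, query, symbols):
    """KB entails query iff query holds in every model where KB holds.

    kb and query map a model (dict: symbol -> bool) to a truth value.
    """
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not query(model):
            return False
    return True

# "Pit in [2,2] or [3,1]" plus "no pit in [2,2]" entails "pit in [3,1]".
kb = lambda m: (m["P22"] or m["P31"]) and not m["P22"]
print(entails(kb, lambda m: m["P31"], ["P22", "P31"]))  # True
```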
[Figure 6.4 board diagrams (a) and (b) omitted.] Figure 6.4 Two later
stages in the progress of the agent. (a) After the third move, with percept
[Stench, None, None, None, None]. (b) After the fifth move, with
percept [Stench, Breeze, Glitter, None, None].
the syntax and semantics are defined precisely, we can call the language
a logic.3 From the syntax and semantics, we can derive an inference
mechanism for an agent that uses the language. We now explain how this
comes about. First, recall that the semanti
surprising that it does not give us much mileage as a representation
language. First-order logic commits to the representation of worlds in
terms of objects and predicates on objects (i.e., properties of objects or
relations between objects), as well as u
in Chapter 4. Iterative deepening was first used by Slate and Atkin
(1977) in the CHESS 4.5 game-playing program. The textbooks by
Nilsson (1971; 1980) are good general sources of information about
classical search algorithms, although they are now somewh
procedure for propositional logic based on using the inference rules from
Section 6.4. In fact, a version of this very problem was the first
addressed by Cook (1971) in his theory of NP-completeness. (See also
the appendix on complexity.) Cook showed that
polynomial-time inference procedure exists. This
is the class called Horn sentences. A Horn sentence has the form: P1
∧ P2 ∧ … ∧ Pn ⇒ Q, where the Pi and Q are nonnegated atoms. There are
two important special cases: First, when Q is the
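With a simple encoding of Horn sentences (premise list plus conclusion, invented here for illustration), the polynomial-time inference alluded to above can be sketched as forward chaining: repeatedly fire any clause whose premises are all known facts.

```python
def forward_chain(horn_clauses, facts):
    """Derive all atoms entailed by Horn clauses from an initial fact set.

    horn_clauses: list of (premises, conclusion) pairs of atom names.
    Each pass either adds a new fact or terminates, so the total work
    is polynomial in the number of clauses and atoms.
    """
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in horn_clauses:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

clauses = [(["P1", "P2"], "Q"), (["Q"], "R")]
print(forward_chain(clauses, {"P1", "P2"}))  # {'P1', 'P2', 'Q', 'R'}
```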
gold and the wumpus are chosen randomly, with a uniform distribution,
from the squares other than the start square. In addition, each square
other than the start can be a pit, with probability 0.2. In most of the
environments in this class, there is a way
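The environment distribution just described is easy to sample from; here is a sketch of a generator (function and field names are my own, not from the text):

```python
import random

def random_wumpus_world(size=4, pit_prob=0.2, seed=None):
    """Sample a wumpus-world configuration.

    Gold and the wumpus are placed uniformly at random among the
    squares other than the start square (1,1); each non-start square
    is independently a pit with probability pit_prob.
    """
    rng = random.Random(seed)
    squares = [(x, y) for x in range(1, size + 1) for y in range(1, size + 1)]
    non_start = [s for s in squares if s != (1, 1)]
    gold = rng.choice(non_start)
    wumpus = rng.choice(non_start)
    pits = {s for s in non_start if rng.random() < pit_prob}
    return {"gold": gold, "wumpus": wumpus, "pits": pits}

world = random_wumpus_world(seed=0)
print((1, 1) not in world["pits"])  # True
```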
can represent beliefs such as "there is a pit in [2,2] or [3,1]" and "there is
no wumpus in [2,2]," and that can make all the inferences that were
described in the preceding paragraphs.

6.3 REPRESENTATION, REASONING, AND LOGIC
through the space of numbers may pass through many states, but the
only one that matters is the goal state, the number 11111. Of course,
from a theoretical point of view, it is easy to run the general search
algorithm and then ignore all of the path
a single link at a time. 90 Chapter 3. Solving Problems by Searching
3.16 Tests of human intelligence often contain sequence prediction
problems. The aim in such problems is to predict the next member of a
sequence of integers, assuming that the number in
(1969); Deo and Pang (1982) give a more recent survey. For the variant
of the uninformed shortest-paths problem that asks for shortest paths
between all pairs of nodes in a graph, the techniques of dynamic
programming and memoization can be used. For a pr
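The dynamic-programming technique mentioned here for all-pairs shortest paths is exemplified by the Floyd–Warshall algorithm; a compact sketch (the graph encoding is my own choice):

```python
def floyd_warshall(n, edges):
    """All-pairs shortest path costs by dynamic programming.

    edges: dict mapping (i, j) -> edge cost; nodes are 0..n-1.
    dist[i][j] is iteratively improved by allowing each node k in
    turn as an intermediate stop.
    """
    INF = float("inf")
    dist = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for (i, j), cost in edges.items():
        dist[i][j] = min(dist[i][j], cost)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

d = floyd_warshall(3, {(0, 1): 1, (1, 2): 2, (0, 2): 10})
print(d[0][2])  # 3 (via node 1, cheaper than the direct edge)
```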
for any sentence that is entailed. But for many knowledge bases, the
haystack of consequences is infinite, and completeness becomes an
important issue.5 We have said that sound inference is desirable. How is
it achieved? The key to sound inference is to h
detects a breeze in [2,1], so there must be a pit in a neighboring square,
either [2,2] or [3,1]. (The notation P? indicates a possible pit.) The pit
cannot be in [1,1], because the
2 Or is it wumpi?
156 Chapter 6. Agents
that Reason Logically
now know the rules of the wumpus world, but we do not yet have an
idea of how a wumpus world agent should act. An example will clear
this up and will show why a successful agent will need to have some
kind of logical reasoning ability. Figure 6.3(a) shows
an "A" for "And.") ∨ (or). A sentence using ∨, such as A ∨ (P ∧ Q), is a
disjunction of the disjuncts A and (P ∧ Q). (Historically, the ∨ comes
from the Latin "vel," which means "or." For most people, it is easier to
remember as an upside-down and.) Secti
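The semantics of disjunction is easy to state operationally: the sentence is true in a model just when at least one disjunct is. A tiny sketch evaluating the example sentence A ∨ (P ∧ Q) in a given model:

```python
def eval_disjunction(model):
    """Evaluate A v (P ^ Q): true iff at least one disjunct is true."""
    return model["A"] or (model["P"] and model["Q"])

print(eval_disjunction({"A": False, "P": True, "Q": True}))   # True
print(eval_disjunction({"A": False, "P": True, "Q": False}))  # False
```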
objects with certain relations between them that do or do not hold.
Special-purpose logics make still further ontological commitments; for
example, temporal logic assumes that the world is ordered by a set of
time points or intervals, and includes built-i
"arrangement" of the world that may or may not be the case.
FIRST VILLAGER: We have
found a witch. May we burn her? ALL: A witch! Burn her! BEDEVERE:
Why do you think she is a witch? SECOND VILLAGER: S
relationships for the four binary connectives. We have said that models
are worlds. One might feel that real worlds are rather messy things on
which to base a formal system. Some authors prefer to think of models
as mathematical objects. In this view, a m
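The truth-value relationships for the four binary connectives can themselves be tabulated mechanically; a sketch (implication is computed as ¬P ∨ Q, and the column names are my own):

```python
from itertools import product

def truth_table():
    """One row per input pair, giving all four binary connectives."""
    rows = []
    for p, q in product([True, False], repeat=2):
        rows.append({"P": p, "Q": q,
                     "and": p and q,
                     "or": p or q,
                     "implies": (not p) or q,   # P => Q as not-P or Q
                     "iff": p == q})
    return rows

table = truth_table()
print(len(table))  # 4
```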
essential that a reasoning
system be able to draw conclusions that follow from the premises,
regardless of the world to which the sentences are intended to refer. But
it is a good idea for a reasoning system to
not also make bridges out of stone? ALL: Yes, of course... um... er...
BEDEVERE: Does wood sink in water? ALL: No, no, it floats. Throw
her in the pond. BEDEVERE: Wait. Wait. Tell me, what also floats on
water? ALL: Bread? No, no, no. Apples... gravy... very sm
artificial agents can also avoid looping. d. Can you think of a real
domain in which step costs are such as to cause looping? 3.6 The
GENERAL-SEARCH algorithm consists of three steps: goal test,
generate, and ordering function, in that order. It seems a
add some new sentences to the knowledge base, all the sentences
entailed by the original KB are still entailed by the new larger
knowledge base. Formally, we can state the property of monotonicity of
a logic as follows: if KB1 ⊨ α then (KB1 ∪ KB2) ⊨ α. T
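Monotonicity can be observed directly with a model-checking entailment test: adding a sentence to the knowledge base never retracts an old conclusion. A sketch (the list-of-Boolean-functions encoding is an assumption made for illustration):

```python
from itertools import product

def entails(kb_sentences, query, symbols):
    """True iff query holds in every model satisfying all KB sentences."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if all(s(model) for s in kb_sentences) and not query(model):
            return False
    return True

kb1 = [lambda m: m["P"]]
query = lambda m: m["P"]
kb2 = kb1 + [lambda m: m["Q"]]  # a larger KB: kb1 plus one new sentence
# The conclusion survives the addition, as monotonicity requires.
print(entails(kb1, query, ["P", "Q"]), entails(kb2, query, ["P", "Q"]))  # True True
```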
if its knowledge base entails the sentence "[2,2] is OK." In other words,
the inference procedure has to show that the sentence "If KB is true then
[2,2] is OK" is a valid sentence. If it is valid, then it does not matter that
the computer does not know t
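Checking that "If KB is true then [2,2] is OK" is valid amounts to confirming the implication holds in every possible model. A sketch with a toy KB of my own devising (no breeze in [2,1], plus the breeze-iff-adjacent-pit rule):

```python
from itertools import product

def is_valid(sentence, symbols):
    """A sentence is valid iff it is true in every possible model."""
    return all(sentence(dict(zip(symbols, values)))
               for values in product([True, False], repeat=len(symbols)))

# KB: no breeze in [2,1], and breeze in [2,1] iff a pit in [2,2] or [3,1].
kb = lambda m: (not m["B21"]) and (m["B21"] == (m["P22"] or m["P31"]))
no_pit_22 = lambda m: not m["P22"]
implication = lambda m: (not kb(m)) or no_pit_22(m)  # KB => no pit in [2,2]
print(is_valid(implication, ["B21", "P22", "P31"]))  # True
```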
system, and P is a long description of the eventual departure of Pluto
from the system. 6 In these examples, we are assuming that words like
"if," "then," "every," "or," and "not" are part of the standard syntax of the
language, and thus are not open to var
represented on the printed page, but the real representation is inside the
computer: each sentence is implemented by a physical configuration or
physical property of some part of the agent. For now, think of this as
being a physical pattern of electrons i