CS 188 Fall 1993
Introduction to AI
Stuart Russell

Final examination

You have 2 hours 50 min. The exam is open-book, open-notes. There are a total of 100 points available. Write your answers in blue books. Hand them all in.

Several of the questions on this exam are true/false or multiple choice. In all the multiple-choice questions, more than one of the choices may be correct. Give all correct answers. Each multiple-choice question will be graded as if it consisted of a set of true/false questions, one for each possible answer.

1. (10 pts.) Definitions

Provide brief, precise definitions of the following:
(a) Decision-theoretic agent
(b) Intrinsic property
(c) Clause (in CNF)
(d) Conditional probability
(e) Optical flow

2. (10 pts.) Knowledge representation

Represent the following sentences in first-order logic, using one consistent ontology. For each predicate, function, and constant symbol you use, say what it means in English.

(a) (2) Water is a liquid and Ty Nant is a kind of water.
(b) (3) Each of Earth's oceans contains some water.
(c) (2) A pint of water weighs a pound.
(d) (3) Water has a boiling point. (Hint: don't say HasBoilingPoint; think what happens to a piece of water above a certain temperature.)

3. (10 pts.) Logical Inference
Multiple choice: Given the following premises:
∀z ¬Q(C, z)
∀x,y P(f(x), f(y)) ⇒ P(f(y), f(x))
∀y Q(y, A) ⇒ P(y, y)
Q(A, A)
∀y Q(f(B), y) ∨ Q(C, y)

which of the following conclusions is logically justified?

(a) (1) P(A, A)
(b) (2) P(f(A), f(A))
(c) (3) P(f(B), f(B))
(d) (4) The empty clause (i.e., the KB is inconsistent).

4. (12 pts.) Uninformed search
[Figure: three search trees I, II, and III; in each, S is the start state and G marks a goal state.]

Consider the search spaces shown in the figure above; in each, G marks a goal state. Successors are generated left-to-right, and each arc is unidirectional (downward). For each search space (I, II, III), list the MOST efficient and LEAST efficient algorithm in terms of absolute computation time (NOT O() time):

(a) Breadth-first
(b) Depth-first
(c) Depth-first iterative deepening
(d) Depth-first with repeated-state checking

You may assume that the time for a repeated-state check is small compared to the time to generate a set of successors.

5. (11 pts.) Heuristic search

In this question we will consider the problem of colouring an undirected graph (a graph consists of a set of vertices V joined by edges E). Two vertices are adjacent if there is an edge linking them. A colouring of a graph assigns a colour to each vertex. It is valid if no two adjacent vertices are the same colour. In this case the objective is to find the minimal valid colouring, the one using the least number of different colours.

(a) (4) Define this problem formally as a search problem suitable for solution by a standard search algorithm such as breadth-first or A*.
(b) (2) Give a nontrivial, admissible heuristic function for this problem that runs in at most linear time in the size of the graph (that is, O(|V| + |E|)).
(c) (2) Show that your heuristic is admissible.
(d) (3) Explain how you would set the problem up for simulated annealing. Does your energy function have local minima?

6. (12 pts.) Probability and decision theory

Let the current world state be denoted by a random variable W0, and the current percept be S0. Let A be a decision variable whose value is the action the agent will do in the current state, and let S1 and W1 be the resulting percept and world state. Let U be the utility of the resulting state. Assume initially that the sensor's operation is unaffected by any action the agent may take.

(a) (3) Draw the influence diagram for this decision problem.
(b) (4) Write the name (i.e., P(... | ...)) of each distinct conditional probability distribution required for this influence diagram (note that two of the five will usually be identical). Explain in words what each distribution represents.
(c) (3) Suppose that the world is fully accessible. What does this tell you about the conditional probability distributions at the nodes S0 and S1?
(d) (2) Suppose that for some values of A the sensor is likely to become damaged. Explain, in words or by redrawing, how the influence diagram in (a) should be changed.

7. (12+3 pts.) Neural networks

In this question we will consider neural nets with inputs in the range [0, 1] and with g a step function. A network is defined by the weights on the links and the threshold value of g at each node.

(a) (2) Draw a network to represent the majority function (at least half the inputs high) for 4 input nodes.
(b) (3) Draw a network to represent the "exactly two out of three" function for three inputs.
(c) (3) Describe why and how you might apply simulated annealing to train a neural network.
(d) (3) Suppose you are training a neural network in a genuinely nondeterministic domain. You give it 100 copies of the same example, 75 of which are positive and 25 of which are negative. Using the standard error function

    E = (1/2) Σ_e (T^e − O^e)²

where the sum is taken over the examples in the training set and where T^e is the correct value for the example and O^e is the actual output, calculate the error when the network converges to output 1.0 and output 0.75 respectively. Comment on the result. For what sort of classification task might an output of 1.0 be appropriate given this data?
(e) (3) (Extra credit) Can you design an error function such that the network will output a suitable probability value when trained on data of this sort?

8. (15 pts.) Natural language

Consider the following context-free grammar:

    S → NP VP
    S → if S then S
    NP → Determiner Modifier Noun | Pronoun
    Determiner → a | the | three
    Pronoun → I | you | me | he | him
    Modifier → Adjective* | Noun*
    Adjective → large | small
    Noun → tennis | racquet | head | kindergarten | table
    VP → Verb NP
    Verb → clean | break

(a) (3) Multiple choice: Which of the following sentences are generated by the grammar?
    i. if if you break me then I break you then I clean the small tennis racquet
    ii. the large tennis tennis clean me me me
    iii. the tennis kindergarten large head clean you
(b) (2) Write down at least one other English sentence generated by the grammar above. It should be significantly different from the above sentences, and should be at least six words long. Do not use any of the words from the above sentences; instead, add grammatical rules of your own, of the form (grammatical category) → (specific word), for instance, Noun → bottle.
(c) (2) Show the parse tree for your sentence.
(d) (2) Fix the rule "S → if S then S" so that it disallows the generation of consecutive ifs but still allows nested conditionals. (Hint: think about what the "condition" part of a conditional sentence can be.)
(e) (4) (open-ended) This grammar allows "noun-noun modification" (e.g., "tennis racquet" or "table tennis"). Not all noun strings in English are considered grammatical; for example, you can say "tennis racquet shop" but not "electron dictionary elbow."
Is this a syntactic or semantic distinction? How might it be handled by a definite clause grammar?
(f) (2) In German, noun-noun pairs (triplets, etc.) are run together into one word (this also occurs in English very occasionally, as in "bathroom"). Would you need to change a natural language processing system to handle this? If so, how?

9. (8 pts.) Vision/NLP

(Open-ended) What do vision and spoken language understanding[1] have in common? Consider both low-level and high-level aspects of each task, and try to construct a detailed analogy. Are there fundamental differences between the two tasks, beyond the physical differences in the two kinds of signals?

[1] "Understanding" means understanding the content, not just recognizing the words.
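Question 4 asks you to compare uninformed search algorithms, including depth-first iterative deepening. As a study aid, here is a minimal Python sketch of that algorithm; the dictionary-based tree and the node names are illustrative assumptions, not the trees from the exam figure.

```python
def dls(tree, node, goal, limit):
    """Depth-limited DFS: expands successors left-to-right (as in question 4)."""
    if node == goal:
        return [node]
    if limit == 0:
        return None
    for child in tree.get(node, []):
        path = dls(tree, child, goal, limit - 1)
        if path is not None:
            return [node] + path
    return None

def iddfs(tree, start, goal, max_depth=20):
    """Iterative deepening: repeat depth-limited search with a growing limit."""
    for limit in range(max_depth + 1):
        path = dls(tree, start, goal, limit)
        if path is not None:
            return path
    return None

# Hypothetical search tree: successors listed left-to-right, arcs downward.
tree = {"S": ["A", "B"], "A": ["C", "D"], "B": ["G"]}
print(iddfs(tree, "S", "G"))  # → ['S', 'B', 'G']
```

Like breadth-first search, this finds a shallowest goal, but it stores only the current depth-first path; the cost is re-expanding shallow nodes on each iteration, which is the trade-off the question asks you to weigh on each of the three trees.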