
# final-sp09: CS 188 Spring 2009 Final Exam


CS 188 Introduction to Artificial Intelligence, Spring 2009 — Final Exam

**INSTRUCTIONS**

You have 3 hours. The exam is closed book and closed notes, except for a two-page, double-sided crib sheet. Please use non-programmable calculators only. Mark your answers ON THE EXAM ITSELF. If you are not sure of your answer, you may wish to provide a brief explanation. All short-answer sections can be successfully answered in a few sentences at most.

Last Name · First Name · SID · Login · GSI · Section Time

All the work on this exam is my own. (please sign)

For staff use only:

| Q. 1 | Q. 2 | Q. 3 | Q. 4 | Q. 5 | Q. 6 | Total |
|------|------|------|------|------|------|-------|
| /16  | /15  | /20  | /10  | /21  | /18  | /100  |

## 1. (12 points) Linear Naive Bayes

Recall that a Naive Bayes classifier with observed random variables $F_i$ ($i = 1, \ldots, n$) and query variable $Y$ uses the classification rule:

$$\arg\max_y P(y \mid f_1, \ldots, f_n) = \arg\max_y P(y) \prod_{i=1}^{n} P(f_i \mid y)$$

And a linear classifier (for example, perceptron) uses the classification rule:

$$\arg\max_y \sum_{i=0}^{n} w_{y,i} \cdot f_i$$

where $f_0 = 1$ is a bias feature.

(a) (8 pt) Consider a Naive Bayes classifier with binary-valued features, i.e. $f_i \in \{0, 1\}$. Prove that it is also a linear classifier, by defining weights $w_{y,i}$ (for $i = 0, \ldots, n$) such that both decision rules above are equivalent. The weights should be expressed in terms of the Naive Bayes probabilities $P(y)$ and $P(f_i \mid y)$. You may assume that all the Naive Bayes probabilities are non-zero.

(b) (4 pt) For the training set below, with binary features $F_1$ and $F_2$ and label $Y$, either name a smoothing method that would estimate a naive Bayes model that correctly classifies all of the training data, or state that this is impossible (i.e., that no smoothing method would give appropriate probabilities).

| $F_1$ | $F_2$ | $Y$ |
|-------|-------|-----|
| 0     | 0     | 0   |
| 0     | 1     | 1   |
| 1     | 0     | 1   |
| 1     | 1     | 0   |
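As a numerical sanity check for part (a) (this is an editor's sketch, not part of the exam or its official solutions): one standard weight construction takes $w_{y,0} = \log P(y) + \sum_i \log P(f_i{=}0 \mid y)$ and $w_{y,i} = \log P(f_i{=}1 \mid y) - \log P(f_i{=}0 \mid y)$. The snippet below draws random non-zero Naive Bayes parameters and verifies that the two decision rules agree on every binary feature vector.

```python
from itertools import product
import numpy as np

rng = np.random.default_rng(0)
n, n_classes = 4, 3

# Random non-zero Naive Bayes parameters: P(y) and P(F_i = 1 | y).
p_y = rng.dirichlet(np.ones(n_classes))          # class prior, shape (n_classes,)
p_f1 = rng.uniform(0.05, 0.95, (n_classes, n))   # P(F_i = 1 | y)

# One standard weight construction (an assumption, not the official answer):
#   w_{y,0} = log P(y) + sum_i log P(f_i = 0 | y)
#   w_{y,i} = log P(f_i = 1 | y) - log P(f_i = 0 | y)
w0 = np.log(p_y) + np.log(1 - p_f1).sum(axis=1)
w = np.log(p_f1) - np.log(1 - p_f1)

def nb_predict(f):
    """Naive Bayes rule: argmax_y log P(y) + sum_i log P(f_i | y)."""
    scores = np.log(p_y) + np.where(f, np.log(p_f1), np.log(1 - p_f1)).sum(axis=1)
    return scores.argmax()

def linear_predict(f):
    """Linear rule: argmax_y w_{y,0} + sum_i w_{y,i} f_i."""
    return (w0 + w @ f).argmax()

# The two decision rules agree on all 2^n binary feature vectors.
assert all(nb_predict(np.array(f)) == linear_predict(np.array(f))
           for f in product([0, 1], repeat=n))
print("decision rules agree")
```

Working in log space is what makes the product of probabilities into a sum, which is exactly the linear form the question asks for.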

## 2. (15 points) Blind Connect Three

In Connect Three, players alternate dropping pieces into one of four columns. A player wins by having three consecutive pieces of their color horizontally, vertically, or diagonally. Assume columns have infinite height. A dropped piece always occupies the lowest open space in its column.

You are playing the game blindfolded against a random opponent, so you cannot see the opponent's moves. However, you can always hear the opponent's piece sliding into place: when the opponent drops his piece along the edge of the board, it makes a "zing" sound, and when he drops it in one of the two center columns, it makes a "zang" sound. You, on the other hand, know exactly which moves you have made. When a player ...
This is the end of the preview.
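The blindfolded setup above is a partial-observability problem: you never see the opponent's column, only a "zing" (one of the two edge columns) or a "zang" (one of the two center columns), so you must maintain a belief over possible boards. The sketch below (an editor's illustration, not part of the exam; it tracks only column heights, ignoring piece colors) shows the belief update, assuming the random opponent picks uniformly among the columns consistent with the heard sound.

```python
from collections import defaultdict

# Columns 0 and 3 are edges ("zing"); 1 and 2 are the center ("zang").
SOUND_TO_COLS = {"zing": [0, 3], "zang": [1, 2]}

def drop(board, col):
    """Return a new board (tuple of column heights) after a drop in col."""
    b = list(board)
    b[col] += 1
    return tuple(b)

def update_belief(belief, sound):
    """belief: dict mapping board -> probability. Condition on the sound."""
    new = defaultdict(float)
    cols = SOUND_TO_COLS[sound]
    for board, p in belief.items():
        for c in cols:
            # Random opponent: uniform over the columns consistent with the sound.
            new[drop(board, c)] += p / len(cols)
    return dict(new)

belief = {(0, 0, 0, 0): 1.0}          # empty board, four columns
belief = update_belief(belief, "zing")  # 2 boards, each with prob 0.5
belief = update_belief(belief, "zang")  # 4 boards, each with prob 0.25
print(sorted(belief.items()))
```

After hearing "zing" then "zang", the belief spreads over four equally likely height profiles, which is exactly the kind of state uncertainty the rest of the question reasons about.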
