CS446: Pattern Recognition and Machine Learning                    Fall 2009
Problem Set 2
Handed Out: September 11, 2009
Due: September 22, 2009

• Feel free to talk to your classmates about the homework. I am more concerned that you learn how to solve the problem than that you demonstrate that you solved it entirely on your own. You should, however, write down your solution yourself. Please try to keep the solution brief and clear.

• Please, no handwritten solutions. Be sure your name appears on the top of each page.

• Please present your algorithms in both pseudocode and English. That is, give a precise formulation of your algorithm as pseudocode, and also explain in one or two concise paragraphs what your algorithm does. Be aware that pseudocode is much simpler and more abstract than real code. Take a look at the textbook pseudocode (e.g. Table 2.5 on page 33) to get an idea of the appropriate level of abstraction.

• The homework is due at 4:00 pm on the due date. Email your writeup and your code to the TA. Please do NOT hand in a hard copy of your writeup. Please put "<userid> CS446 hw2 submission" as the subject line of the email when you submit your homework to [email protected]. Put all of your files into a single compressed file with the file name "<userid>hw2.tgz".

1. [Representing Boolean Functions - 20 points]

(Based on Mitchell, exercise 3.1.) Give decision trees to represent the following Boolean functions:

a. ¬A ∨ B ∧ C [3 points]
b. (A ∧ ¬B) ∨ ¬(C ∧ D) [3 points]
c. (A ∨ B) ⊕ C ∨ A ⊕ (¬B ∧ C) [4 points]

To draw a tree in LaTeX, you can use the qtree package, which can be downloaded from the web [1]. Here is an example that uses the qtree package (the rendered example tree, with nodes +, A, B, C, did not survive text extraction).

2. [Implementing Decision Trees - 80 points]

In this programming assignment, you will implement a simple ID3-like decision tree learning algorithm and test it on a data set. We will use a data set similar to the one from the Badges Game.
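As a minimal self-contained sketch of the qtree syntax (using the simple conjunction A ∧ B purely for illustration, not one of the assigned functions, and adopting the convention that the left branch is the false outcome and the right branch the true one):

```latex
\documentclass{article}
\usepackage{qtree}
\begin{document}
% Decision tree for A AND B: test A first; if A is false the function is -,
% otherwise the outcome is decided by B. Left child = false, right = true.
\Tree [.A {-} [.B {-} {+} ] ]
\end{document}
```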
You may use the programming language of your choice. The data is available from the course web site in a file called Badges3. It is given as a list of names, each preceded by a label '+' or '-'. Altogether there are 158 positive examples and 136 negative examples.

[1] http://www.ling.upenn.edu/advice/latex/qtree/

Your Program

Your program should perform the items listed below. Please note that your actual implementation of the decision tree algorithm should be independent of the feature extraction mechanism, as we may use it in other assignments. In particular, we may require you to reuse this generic decision tree code for rule extraction and boosting...
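The core of an ID3-like learner is the splitting criterion. As a sketch of one possible starting point (the dict-based example representation and the feature name `vowel` are hypothetical, not a required interface), the helpers below compute entropy and information gain while staying independent of how features are extracted:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels (e.g. '+'/'-')."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples, labels, feature):
    """Expected reduction in entropy from splitting on one feature.

    `examples` is a list of dicts mapping feature name -> value,
    so the learner never needs to know how features were extracted.
    """
    n = len(labels)
    gain = entropy(labels)
    for value in set(ex[feature] for ex in examples):
        subset = [lab for ex, lab in zip(examples, labels) if ex[feature] == value]
        gain -= (len(subset) / n) * entropy(subset)
    return gain
```

ID3 would then recursively pick the feature with the highest gain, split the examples on it, and stop when a node's labels are pure or no features remain. Keeping the feature dicts as the only interface between extraction and learning makes it easy to reuse the same tree code on other data sets later.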
This note was uploaded on 04/23/2010 for the course CS 446 (Computer Science), taught by Professor Ahuja during the Fall '08 term at the University of Illinois at Urbana–Champaign.