David Hayman
HW2: Part 1 – decision tree and explanation on the last two pages

Training data:

Age      Income   Student   Credit Rating   Buys Computer
<=30     High     No        Fair            No
<=30     High     No        Excellent       No
31…40    High     No        Fair            Yes
>40      Medium   No        Fair            Yes
>40      Low      Yes       Fair            Yes
>40      Low      Yes       Excellent       No
31…40    Low      Yes       Excellent       Yes
<=30     Medium   No        Fair            No
<=30     Low      Yes       Fair            Yes
>40      Medium   Yes       Fair            Yes
<=30     Medium   Yes       Excellent       Yes
31…40    Medium   No        Excellent       Yes
31…40    High     Yes       Fair            Yes
>40      Medium   No        Excellent       No

Problem 1, part 1: "one with general majority voting, as defined in lecture notes, i.e. majority voting at any node of your choice. Use CREDIT RATING as the root attribute, and node attributes of your choice."

Splitting on the root attribute Credit Rating partitions the training data into two subsets:

Credit Rating = Fair
Age      Income   Student   Class
<=30     High     No        No
31…40    High     No        Yes
>40      Medium   No        Yes
>40      Low      Yes       Yes
<=30     Medium   No        No
<=30     Low      Yes       Yes
>40      Medium   Yes       Yes
31…40    High     Yes       Yes

Credit Rating = Excellent
Age      Income   Student   Class
<=30     High     No        No
>40      Low      Yes       No
31…40    Low      Yes       Yes
<=30     Medium   Yes       Yes
31…40    Medium   No        Yes
>40      Medium   No        No
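For readers who want to check the split mechanically, the following is a minimal Python sketch, not part of the original homework: it encodes the 14 training tuples and partitions them on the root attribute Credit Rating. The COLUMNS/DATA layout and the helper name partition are my own choices.

    # Sketch only: encode the training data and reproduce the split on the
    # root attribute Credit Rating (the two sub-tables above).
    COLUMNS = ("Age", "Income", "Student", "Credit Rating", "Buys Computer")

    DATA = [
        ("<=30",  "High",   "No",  "Fair",      "No"),
        ("<=30",  "High",   "No",  "Excellent", "No"),
        ("31…40", "High",   "No",  "Fair",      "Yes"),
        (">40",   "Medium", "No",  "Fair",      "Yes"),
        (">40",   "Low",    "Yes", "Fair",      "Yes"),
        (">40",   "Low",    "Yes", "Excellent", "No"),
        ("31…40", "Low",    "Yes", "Excellent", "Yes"),
        ("<=30",  "Medium", "No",  "Fair",      "No"),
        ("<=30",  "Low",    "Yes", "Fair",      "Yes"),
        (">40",   "Medium", "Yes", "Fair",      "Yes"),
        ("<=30",  "Medium", "Yes", "Excellent", "Yes"),
        ("31…40", "Medium", "No",  "Excellent", "Yes"),
        ("31…40", "High",   "Yes", "Fair",      "Yes"),
        (">40",   "Medium", "No",  "Excellent", "No"),
    ]

    def partition(rows, attr):
        """Group rows by their value of `attr`; each group is one branch of the node."""
        i = COLUMNS.index(attr)
        groups = {}
        for row in rows:
            groups.setdefault(row[i], []).append(row)
        return groups

    # Reproduce the Fair / Excellent sub-tables shown above.
    for value, subset in partition(DATA, "Credit Rating").items():
        print(value)
        for row in subset:
            print("  ", row)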
With Credit Rating at the root, the Fair subset contains 6 Yes and 2 No tuples, so general majority voting at that node gives Class = yes. The Excellent subset is split further on Student:

Credit Rating = Excellent, Student = No
Age      Income   Class
<=30     High     No
31…40    Medium   Yes
>40      Medium   No

Credit Rating = Excellent, Student = Yes
Age      Income   Class
>40      Low      No
31…40    Low      Yes
<=30     Medium   Yes

Majority voting at these two nodes gives Class = no for Student = No (2 No vs. 1 Yes) and Class = yes for Student = Yes (2 Yes vs. 1 No):

Credit Rating
  Fair      -> Class = yes
  Excellent -> Student
                 No  -> Class = no
                 Yes -> Class = yes

Final decision tree, with the leaves written in terms of Buys Computer:

Credit Rating
  Fair      -> Buys = yes
  Excellent -> Student
                 No  -> Buys = no
                 Yes -> Buys = yes
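To make the majority-voting step explicit, here is a small Python sketch of my own, not part of the original solution. The label lists are read directly off the three sub-tables above, and the helper name majority_class is an assumption.

    from collections import Counter

    def majority_class(labels):
        """General majority voting: label a node with its most common class."""
        return Counter(labels).most_common(1)[0][0]

    # Class labels read off the sub-tables above.
    fair                  = ["No", "Yes", "Yes", "Yes", "No", "Yes", "Yes", "Yes"]  # 6 Yes / 2 No
    excellent_student_no  = ["No", "Yes", "No"]                                     # 2 No / 1 Yes
    excellent_student_yes = ["No", "Yes", "Yes"]                                    # 2 Yes / 1 No

    print(majority_class(fair))                   # Yes -> Buys = yes
    print(majority_class(excellent_student_no))   # No  -> Buys = no
    print(majority_class(excellent_student_yes))  # Yes -> Buys = yes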
Problem 1, part 2: one without general majority voting, i.e. use the ID3 algorithm (without Information Gain). Use INCOME as the root attribute, and node attributes of your choice.

Splitting on the root attribute Income gives three branches (High, Medium, Low). The High branch contains the following tuples:

Income = High
Age      Student   Credit Rating   Buys Computer
<=30     No        Fair            No
<=30     No        Excellent       No
31…40    No        Fair            Yes
31…40    Yes       Fair            Yes

The first tree will have the attribute list passed recursively; that is, all unused attributes are passed to each child node when the algorithm is called to generate each subtree, so attributes used in the subtree of one child do not affect the subtrees of another child. Age is an obvious split for the High branch because it requires no additional steps (every <=30 tuple is No and every 31…40 tuple is Yes), and a shallow tree is preferred.
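The recursion described above can be written down as a short skeleton. The sketch below is my own illustration, not the homework's code: choose_attr stands in for the by-hand attribute choices (INCOME at the root, then Age for the High branch, and so on), and partition mirrors the helper from the earlier sketch.

    def partition(rows, attr, columns):
        """Split rows into one subset per value of attr (same idea as the earlier sketch)."""
        i = columns.index(attr)
        groups = {}
        for row in rows:
            groups.setdefault(row[i], []).append(row)
        return groups

    def build_tree(rows, attrs, choose_attr, columns):
        """ID3 skeleton without Information Gain (the Problem 1, part 2 variant).

        Each call computes its own `remaining` list, so attributes consumed in
        one child's subtree never constrain the subtrees of its siblings,
        exactly as described in the paragraph above.
        """
        labels = {row[-1] for row in rows}
        if len(labels) == 1:
            return labels.pop()                      # pure node: emit a leaf
        if not attrs:
            # Mixed labels with no attributes left: plain ID3 has no rule here;
            # the majority-voting variant (Problem 1, part 1) would vote instead.
            raise ValueError("exhausted attributes at an impure node")
        attr = choose_attr(rows, attrs)              # e.g. the fixed choices made in this homework
        remaining = [a for a in attrs if a != attr]  # unused attributes passed to every child
        return {attr: {value: build_tree(subset, remaining, choose_attr, columns)
                       for value, subset in partition(rows, attr, columns).items()}}

With INCOME fixed at the root, the High branch is exactly the four tuples tabulated above, and splitting it on Age yields two pure leaves (<=30: No; 31…40: Yes), which is why no further steps are needed there.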