COMP9417 Machine Learning and Data Mining (11s1)
Decision Tree Learning
March 8, 2011

Acknowledgement: material derived from slides by Tom M. Mitchell (http://www-2.cs.cmu.edu/~tom/mlbook.html), Andrew W. Moore (http://www.cs.cmu.edu/~awm/tutorials) and Eibe Frank (http://www.cs.waikato.ac.nz/ml/weka/).

Aims

This lecture will enable you to describe decision tree learning, the use of entropy, and the problem of overfitting. Following it you should be able to:

- define the decision tree representation
- list representation properties of data and models for which decision trees are appropriate
- reproduce the basic top-down algorithm for decision tree induction (TDIDT)
- define entropy in the context of learning a Boolean classifier from examples
- describe the inductive bias of the basic TDIDT algorithm
- define overfitting of a training set by a hypothesis
- describe developments of the basic TDIDT algorithm: pruning, rule generation, numerical attributes, many-valued attributes, costs, missing values

[Recommended reading: Mitchell, Chapter 3]
[Recommended exercises: 3.1, 3.2, 3.4(a,b)]

Introduction

Decision trees are the single most popular data mining tool:

- easy to understand
- easy to implement
- easy to use
- computationally cheap

There are some drawbacks, though (such as overfitting). Decision trees perform classification: they predict a categorical output from categorical and/or real-valued inputs.
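One aim above is to define entropy in the context of learning a Boolean classifier from examples. As a concrete reference point (not part of the original slides), the following Python sketch computes the entropy of a sample from its positive/negative counts — the impurity measure TDIDT uses to score candidate splits.

```python
import math

def entropy(pos, neg):
    """Entropy of a Boolean-labelled sample with `pos` positive and `neg`
    negative examples: -p+ log2(p+) - p- log2(p-), with 0 log 0 taken as 0."""
    total = pos + neg
    if total == 0:
        return 0.0
    result = 0.0
    for count in (pos, neg):
        p = count / total
        if p > 0:
            result -= p * math.log2(p)
    return result

# A pure sample has entropy 0; a 50/50 split has entropy 1 bit.
print(entropy(9, 5))   # about 0.940 (the 9+/5- PlayTennis training set)
print(entropy(7, 7))   # 1.0
print(entropy(14, 0))  # 0.0
```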
Decision Tree for PlayTennis

Outlook = Sunny:
|   Humidity = High: No
|   Humidity = Normal: Yes
Outlook = Overcast: Yes
Outlook = Rain:
|   Wind = Strong: No
|   Wind = Weak: Yes

A Tree to Predict C-Section Risk

Learned from medical records of 1000 women. Negative examples are C-sections.

[833+,167-] .83+ .17-
Fetal_Presentation = 1: [822+,116-] .88+ .12-
| Previous_Csection = 0: [767+,81-] .90+ .10-
| | Primiparous = 0: [399+,13-] .97+ .03-
| | Primiparous = 1: [368+,68-] .84+ .16-
| | | Fetal_Distress = 0: [334+,47-] .88+ .12-
| | | | Birth_Weight < 3349: [201+,10.6-] .95+ .05-
| | | | Birth_Weight >= 3349: [133+,36.4-] .78+ .22-
| | | Fetal_Distress = 1: [34+,21-] .62+ .38-
| Previous_Csection = 1: [55+,35-] .61+ .39-
Fetal_Presentation = 2: [3+,29-] .11+ .89-
Fetal_Presentation = 3: [8+,22-] .27+ .73-

Decision Trees

Decision tree representation:

- each internal node tests an attribute
- each branch corresponds to an attribute value
- each leaf node assigns a classification

How would we represent: ∧, ∨, XOR; (A ∧ B) ∨ (C ∧ ¬D ∧ E); M of N?

For example, X ∧ Y:

X = t:
| Y = t: true
| Y = f: false
X = f: false

and X ∨ Y:

X = t: true
X = f:
| Y = t: true
| Y = f: false
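The representation above maps directly onto code. The following Python sketch (an illustration, not code from the lecture) encodes the PlayTennis tree as a nested dict — internal nodes test an attribute, branches are attribute values, leaves are classifications — and classifies an instance by walking from the root to a leaf.

```python
# A decision tree as a nested dict: an internal node maps an attribute name
# to a dict of {attribute value: subtree}; a leaf is simply a class label.
# This mirrors the PlayTennis tree on the slide (illustrative sketch only).
play_tennis_tree = {
    "Outlook": {
        "Sunny":    {"Humidity": {"High": "No", "Normal": "Yes"}},
        "Overcast": "Yes",
        "Rain":     {"Wind": {"Strong": "No", "Weak": "Yes"}},
    }
}

def classify(tree, instance):
    """Walk from the root to a leaf: test the node's attribute, follow the
    branch matching the instance's value, and return the leaf label."""
    if not isinstance(tree, dict):
        return tree                      # leaf node: a classification
    attribute = next(iter(tree))         # the attribute tested at this node
    value = instance[attribute]          # the branch to follow
    return classify(tree[attribute][value], instance)

print(classify(play_tennis_tree,
               {"Outlook": "Sunny", "Humidity": "High", "Wind": "Weak"}))  # No
```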
Decision Trees (continued)

A tree for "2 of 3" (true when at least two of X, Y, Z are true):

X = t:
| Y = t: true
| Y = f:
| | Z = t: true
| | Z = f: false
X = f:
| Y = t:
| | Z = t: true
| | Z = f: false
| Y = f: false

So in general decision trees represent a disjunction of conjunctions of constraints on the attribute values of instances: each root-to-leaf path is one conjunction of tests, and the tree classifies an instance as positive if it satisfies any path that ends in a positive leaf.
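To make the "disjunction of conjunctions" reading concrete, here is a small Python sketch (again illustrative, not from the lecture) that represents the "2 of 3" tree in the same nested-dict form as above and enumerates the root-to-leaf paths ending in "true"; each path is one conjunction, and the tree is their disjunction.

```python
# The "2 of 3" tree in nested-dict form, as in the earlier sketch.
two_of_three = {
    "X": {
        "t": {"Y": {"t": "true",
                    "f": {"Z": {"t": "true", "f": "false"}}}},
        "f": {"Y": {"t": {"Z": {"t": "true", "f": "false"}},
                    "f": "false"}},
    }
}

def paths_to(tree, label, path=()):
    """Yield each root-to-leaf path (a tuple of attribute=value tests)
    that ends in the given class label."""
    if not isinstance(tree, dict):
        if tree == label:
            yield path
        return
    attribute = next(iter(tree))
    for value, subtree in tree[attribute].items():
        yield from paths_to(subtree, label, path + (f"{attribute}={value}",))

for conjunction in paths_to(two_of_three, "true"):
    print(" AND ".join(conjunction))
# X=t AND Y=t
# X=t AND Y=f AND Z=t
# X=f AND Y=t AND Z=t
```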