hw1_Solution_Part I

CS 6375 Homework 1
Chenxi Zeng, UTD ID: 11124236

Part I:

1. Let $S_0$ be the original data table given in the question; then $\mathrm{Entropy}(S_0) = 1$. Let $I_i$ ($i = 1, 2, 3, 4$) be the information gain obtained by splitting on Attr $i$, i.e., $I_i = \mathrm{IG}(S_0 \mid \text{Attr } i)$.

Stage 1: We have
  $I_1 = I_2 = 1 - 1 = 0$,
  $I_3 = 1 - \frac{3}{4} \times 0.918 = 0.312$,
  $I_4 = 1 - \left( \frac{5}{8} \times 0.971 + \frac{3}{8} \times 0.918 \right) = 0.049$.
So $I_3$ gives the maximum information gain for the first split, and we get the table below:

Attr 3 = c:
  Attr 1 | Attr 2 | Attr 4 | class
  a      | 1      | -1     | 1
  b      | 0      | -1     | 1
  a      | 0      | 1      | 1
  b      | 1      | 1      | 1
  b      | 0      | 1      | 2
  b      | 1      | -1     | 2

Attr 3 = d (done):
  a      | 0      | -1     | 2
  a      | 0      | -1     | 2

$S_1$ is the Attr 3 = c subtable (the red part in the original). We have $\mathrm{Entropy}(S_1) = 0.918$. Let $I_{3i}$ ($i = 1, 2, 4$) be the information gain obtained by splitting $S_1$ on Attr $i$, i.e., $I_{3i} = \mathrm{IG}(S_1 \mid \text{Attr } i)$.

Stage 2: We have
  $I_{31} = 0.918 - 0.667 = 0.251$,
  $I_{32} = I_{34} = 0.918 - 0.918 = 0$.
So $I_{31}$ gives the maximum information gain for the second split, and we get the table below:
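As a sanity check on the Stage 1 arithmetic, here is a minimal Python sketch (not part of the original solution). It computes entropy and information gain from class labels alone; the per-branch labels are read off the Attr 3 split table above, and the names (`entropy`, `information_gain`, `s0`) are illustrative.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(parent, partitions):
    """IG(S | Attr) = Entropy(S) minus the size-weighted child entropies."""
    n = len(parent)
    return entropy(parent) - sum(len(p) / n * entropy(p) for p in partitions)

# Class labels of S_0 grouped by Attr 3, read off the split table above:
# Attr 3 = c has four rows of class 1 and two of class 2; Attr 3 = d is pure.
branch_c = [1, 1, 1, 1, 2, 2]
branch_d = [2, 2]
s0 = branch_c + branch_d

print(entropy(s0))                                 # 1.0
print(information_gain(s0, [branch_c, branch_d]))  # ~0.311
```

The exact gain is about 0.311; the $I_3 = 0.312$ in the hand computation comes from rounding the branch entropy to 0.918 before multiplying.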
Attr 3 = c:
  Attr 1 = a (done):
    Attr 2 | Attr 4 | class
    1      | -1     | 1
    0      | 1      | 1
  Attr 1 = b:
    0      | -1     | 1
    1      | 1      | 1
    0      | 1      | 2
    1      | -1     | 2

Attr 3 = d (done):
  a      | 0      | -1     | 2
  a      | 0      | -1     | 2

$S_2$ is the Attr 1 = b subtable (the red part in the original). We have $\mathrm{Entropy}(S_2) = 1$.
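The Stage 2 numbers can be checked the same way. Below is a self-contained sketch (again, not part of the original solution) using the binary entropy function; the class counts per branch are read off the Stage 2 table above.

```python
from math import log2

def h(p):
    """Binary entropy H(p) in bits, with H(0) = H(1) = 0."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# S_1 has 6 rows: 4 of class 1, 2 of class 2.
entropy_s1 = h(4 / 6)                                       # ~0.918

# Attr 1 = a covers 2 rows, all class 1 (pure); Attr 1 = b covers
# 4 rows split evenly between the two classes (entropy 1).
i_31 = entropy_s1 - (2 / 6 * h(1.0) + 4 / 6 * h(2 / 4))     # ~0.252

# Attr 2 (and likewise Attr 4) splits S_1 into two 3-row branches,
# each with a 2:1 class mix, so the weighted child entropy is ~0.918.
i_32 = entropy_s1 - (3 / 6 * h(2 / 3) + 3 / 6 * h(2 / 3))   # 0.0

print(entropy_s1, i_31, i_32)
```

The exact value of $I_{31}$ is about 0.252; the 0.251 above again reflects rounding the entropies to three decimals before subtracting.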