DATA MINING
Stats202, Lecture 20, Fall 2010
Susan Holmes

Special Announcements

- Do not update your version of R before the end of the quarter.
- All requests should be sent to stats202-aut1011-staff@lists.stanford.edu.
- A new homework is up: it is due next Thursday and contains part of the data for the competition.
- Kaggle: data mining competition; details at the end of the lecture.
- Solutions to the midterm and grades will be in tomorrow; we will hand out the graded exams in class on Friday.
- Feedback form: if you have not filled it in, please find it in the Coursework handouts folder and fax it to 650 725 8977 (anonymous), or email it back to the stats202-aut1011-staff@lists.stanford.edu list. Thank you.

Last Time: Ensemble Methods with R Examples

- Bagging: Bootstrap Aggregation.
- Boosting: combining weak learners.
- Random Forests.

Bootstrapping

P(x_1 \text{ is in the bootstrap resample}) = 1 - \left(1 - \tfrac{1}{n}\right)^n

\left(1 - \tfrac{1}{n}\right)^n = \exp\left(n \log\left(1 - \tfrac{1}{n}\right)\right) \approx \exp\left(n \cdot \left(-\tfrac{1}{n}\right)\right) = e^{-1} = \tfrac{1}{e} \approx 0.36788

OOB = out of the bag (not included in the bootstrap resample). The OOB prediction for an observation is determined by a majority-rule vote of all trees whose training set did not contain that observation. (A short simulation of the 1/e limit and a random-forest OOB sketch appear at the end of these notes.)

Example of Boosting

library(rpart)
library(mlbench)
library(adabag)
data(BreastCancer)
l <- length(BreastCancer[,1])
sub <- sample(1:l, 2*l/3)
train <- BreastCancer[sub, -1]
BC.rpart <- rpart(Class ~ ., data = BreastCancer[sub, -1], maxdepth = 3)
BC.rpart.pred <- predict(BC.rpart, newdata = BreastCancer[-sub, -1], type = "class")
tb <- table(BC.rpart.pred, BreastCancer$Class[-sub])
error.rpart <- 1 - (sum(diag(tb)) / sum(tb))

Example of Boosting (continued)

tb:
BC.rpart.pred  benign  malignant
    benign        131          3
    malignant      13         86

error.rpart
[1] 0.06866953

train2 <- train[which(!is.na(train$Class)), ]
BC.adaboost <- adaboost.M1(Class ~ ., data = train2, mfinal = 25, maxdepth = 3)
BC.adaboost.pred <- predict.boosting(BC.adaboost, newdata = BreastCancer[-sub, -1])
BC.adaboost.pred[-1]

$confusion
                Observed Class
Predicted Class  benign  malignant
      benign        147          4
      malignant       3         80

$error
[1] 0.02991453

BC.adaboost$importance
 Cl.thickness    Cell.size   Cell.shape  Marg.adhesion  Epith.c.size
    15.789474    10.526316     6.140351       5.263158       5.26315
  Bare.nuclei  Bl.cromatin  Normal.nucleoli      Mitoses
    23.684211     7.017544        20.175439     6.140351
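As a follow-up to the boosting example above, here is a minimal sketch (not from the original lecture) that plots the reported variable importances and puts the two test errors side by side. It assumes the objects BC.adaboost, BC.adaboost.pred, and error.rpart created by the code on the slides.

# Sketch: visualize adabag variable importances and compare test errors.
# Assumes BC.adaboost, BC.adaboost.pred and error.rpart from the slides above.
imp <- sort(BC.adaboost$importance, decreasing = TRUE)
barplot(imp, las = 2, cex.names = 0.7,
        main = "adaboost.M1 variable importance (BreastCancer)")
# Single depth-3 tree vs. 25 boosted trees on the same held-out third:
c(single.tree = error.rpart, boosted = BC.adaboost.pred$error)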
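To make the 1/e limit on the Bootstrapping slide concrete, the following small simulation (an added sketch, not part of the lecture) estimates the probability that a fixed observation is left out of a bootstrap resample; the sample size n and number of resamples B are arbitrary illustrative choices.

# Sketch: estimate P(observation 1 is out of the bag) by simulation.
# n, B and the seed are hypothetical values chosen only for illustration.
set.seed(202)
n <- 500                            # size of the original sample
B <- 10000                          # number of bootstrap resamples
oob <- replicate(B, !(1 %in% sample(1:n, n, replace = TRUE)))
mean(oob)                           # should be close to exp(-1)
exp(-1)                             # 0.3678794..., the 0.36788 on the slide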

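The OOB majority-vote rule described on the Bootstrapping slide is what a random forest uses for its internal error estimate. The sketch below is an addition (the slides do not load the randomForest package); it fits a forest to the same BreastCancer data after dropping the Id column and the rows with missing values.

# Sketch: OOB error from a random forest on the same data.
# The randomForest package is an assumption; it is not used in the slides.
library(randomForest)
BC.complete <- na.omit(BreastCancer[, -1])   # drop Id column and incomplete rows
BC.rf <- randomForest(Class ~ ., data = BC.complete, ntree = 500)
print(BC.rf)                                 # confusion matrix built from OOB votes
BC.rf$err.rate[BC.rf$ntree, "OOB"]           # OOB error estimate after the last tree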
