13 - Practical Machine Learning

[Figure: data points are drawn into a bootstrap sample used for training; unused data are filtered out]


...for many applications. Very popular method.

Boosting

"Boosting is one of the most powerful learning ideas introduced in the last twenty years."
Hastie et al., "The Elements of Statistical Learning: Data Mining, Inference, and Prediction", Springer (2009)

AdaBoost

[Figure: example two-class data points in the x1-x2 plane]

AdaBoost
•  Initialize weights for data points
•  For each iteration:
   –  Fit classifier to training data
   –  Compute weighted classification error
   –  Compute weight for classifier from the error
   –  Update weights for data points
•  Final classifier is weighted sum of all single classifiers (see the code sketch below)

AdaBoost
[Figure from Hastie et al., "The Elements of Statistical Learning: Data Mining, Inference, and Prediction", Springer (2009)]

AdaBoost
•  Introduced by Freund and Schapire in 1995
•  Worked great, nobody understood why!
•  Then five...
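The bulleted AdaBoost procedure above can be made concrete with a short sketch. This is a minimal illustration, not part of the slides: it assumes binary labels coded as -1/+1, scikit-learn decision stumps as the weak learners, and a fixed number of rounds; the function names (adaboost_fit, adaboost_predict) and n_rounds=50 are illustrative choices. The weight formulas follow the AdaBoost.M1 description in Hastie et al. (2009).

```python
# Minimal AdaBoost.M1 sketch following the steps on the slide.
# Assumptions (not from the slides): labels are -1/+1, weak learners are
# decision stumps, and the number of boosting rounds is fixed.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    n_samples = len(y)
    w = np.full(n_samples, 1.0 / n_samples)     # initialize weights for data points
    classifiers, alphas = [], []
    for _ in range(n_rounds):
        # Fit a weak classifier (decision stump) to the weighted training data
        clf = DecisionTreeClassifier(max_depth=1)
        clf.fit(X, y, sample_weight=w)
        pred = clf.predict(X)

        # Weighted classification error of this round's classifier
        err = np.sum(w * (pred != y)) / np.sum(w)
        err = np.clip(err, 1e-10, 1.0 - 1e-10)  # guard against log(0)

        # Weight (alpha) for the classifier, computed from its error
        alpha = np.log((1.0 - err) / err)

        # Update data-point weights: misclassified points get more weight
        w *= np.exp(alpha * (pred != y))
        w /= np.sum(w)                          # renormalize for numerical stability

        classifiers.append(clf)
        alphas.append(alpha)
    return classifiers, alphas

def adaboost_predict(classifiers, alphas, X):
    # Final classifier: sign of the weighted sum of all single classifiers
    scores = sum(a * clf.predict(X) for clf, a in zip(classifiers, alphas))
    return np.sign(scores)
```

Each loop iteration mirrors one slide bullet: fit, compute the weighted error, derive the classifier weight, then reweight the data points so the next weak learner focuses on the examples the current ensemble gets wrong.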