Adaboost_Tutorial

Adaboost - Derek Hoiem, March 31, 2004
Outline
- Background
- Adaboost algorithm
- Theory / interpretations
- Practical issues
- Face detection experiments
What’s So Good About Adaboost
- Improves classification accuracy
- Can be used with many different classifiers
- Commonly used in many areas
- Simple to implement
- Not prone to overfitting
A Brief History
- Bootstrapping
- Bagging
- Boosting (Schapire 1989)
- Adaboost (Freund and Schapire 1995)
Bootstrap Estimation
- Repeatedly draw n samples from D
- For each set of samples, estimate a statistic
- The bootstrap estimate is the mean of the individual estimates
- Used to estimate a statistic (parameter) and its variance
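A minimal sketch of the procedure in Python (illustrative only; the toy dataset, the choice of the mean as the statistic, and the number of replicates are my assumptions, not from the slides):

```python
# Bootstrap estimation sketch: estimate a statistic (here the mean) and its
# variance by repeatedly resampling the dataset D with replacement.
import numpy as np

rng = np.random.default_rng(0)
D = rng.normal(loc=5.0, scale=2.0, size=100)   # toy dataset D with n = 100

B = 1000                                       # number of bootstrap replicates
estimates = np.empty(B)
for b in range(B):
    sample = rng.choice(D, size=len(D), replace=True)  # draw n samples from D
    estimates[b] = sample.mean()               # statistic on this resample

bootstrap_estimate = estimates.mean()          # mean of the individual estimates
bootstrap_variance = estimates.var(ddof=1)     # variance of the statistic
print(bootstrap_estimate, bootstrap_variance)
```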
Bagging - Aggregate Bootstrapping
- For i = 1 .. M
  - Draw n' < n samples from D with replacement
  - Learn classifier C_i
- Final classifier is a vote of C_1 .. C_M
- Increases classifier stability / reduces variance
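A hedged sketch of bagging (the dataset, M, n', and the decision-tree base learner are illustrative choices, not the slides' code):

```python
# Bagging sketch: train M classifiers on bootstrap subsamples of D and
# combine them with a majority vote.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, random_state=0)   # toy dataset D
n, M, n_prime = len(X), 25, 300                              # n' < n samples per round

classifiers = []
for i in range(M):
    idx = rng.choice(n, size=n_prime, replace=True)          # bootstrap subsample
    C_i = DecisionTreeClassifier(max_depth=3).fit(X[idx], y[idx])
    classifiers.append(C_i)

# Final classifier: majority vote of C_1 .. C_M
votes = np.stack([C.predict(X) for C in classifiers])
y_hat = (votes.mean(axis=0) > 0.5).astype(int)
print("training accuracy of the vote:", (y_hat == y).mean())
```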
Boosting (Schapire 1989)
- Randomly select n_1 < n samples from D without replacement to obtain D_1
- Train weak learner C_1
- Select n_2 < n samples from D, with half of the samples misclassified by C_1, to obtain D_2
- Train weak learner C_2
- Select all samples from D on which C_1 and C_2 disagree
- Train weak learner C_3
- Final classifier is a vote of the weak learners
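A rough sketch of this three-learner scheme (illustrative; the sample sizes, the stump base learner, and the exact way D_2 is balanced between C_1's mistakes and successes are my assumptions):

```python
# Schapire-style boosting sketch with three weak learners (decision stumps).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=1000, random_state=0)
n = len(X)

def stump(X, y):
    return DecisionTreeClassifier(max_depth=1).fit(X, y)

# D1: n1 < n samples drawn without replacement
idx1 = rng.choice(n, size=400, replace=False)
C1 = stump(X[idx1], y[idx1])

# D2: half of its samples misclassified by C1, half classified correctly
wrong = np.flatnonzero(C1.predict(X) != y)
right = np.flatnonzero(C1.predict(X) == y)
m = min(200, len(wrong), len(right))
idx2 = np.concatenate([rng.choice(wrong, m, replace=False),
                       rng.choice(right, m, replace=False)])
C2 = stump(X[idx2], y[idx2])

# D3: all samples on which C1 and C2 disagree
idx3 = np.flatnonzero(C1.predict(X) != C2.predict(X))
C3 = stump(X[idx3], y[idx3]) if len(idx3) else C1

# Final classifier: majority vote of the three weak learners
votes = np.stack([C.predict(X) for C in (C1, C2, C3)])
y_hat = (votes.mean(axis=0) > 0.5).astype(int)
print("training accuracy:", (y_hat == y).mean())
```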
Adaboost - Adaptive Boosting
- Instead of sampling, re-weight the training examples
- After re-weighting, the previous weak learner has only 50% accuracy over the new distribution
- The re-weighted distribution can be used to learn the next weak classifier
- Final classification is based on a weighted vote of the weak classifiers
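As a concrete illustration of the re-weighting loop (a minimal sketch, not code from the slides; the toy dataset, number of rounds, and decision-stump weak learner are my choices):

```python
# AdaBoost sketch: re-weight the training set each round, fit a decision stump
# on the weighted data, and classify with a weighted vote of the stumps.
# Labels are mapped to {-1, +1}.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y01 = make_classification(n_samples=500, random_state=0)
y = 2 * y01 - 1                                  # map {0,1} -> {-1,+1}
n, T = len(X), 50

w = np.full(n, 1.0 / n)                          # initial uniform weights
stumps, alphas = [], []
for t in range(T):
    h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
    pred = h.predict(X)
    eps = w[pred != y].sum()                     # weighted error of this round
    alpha = 0.5 * np.log((1 - eps) / max(eps, 1e-12))
    w *= np.exp(-alpha * y * pred)               # up-weight the mistakes
    w /= w.sum()                                 # renormalize to a distribution
    stumps.append(h)
    alphas.append(alpha)

# Strong classifier: sign of the weighted vote of the weak classifiers
F = sum(a * h.predict(X) for a, h in zip(alphas, stumps))
print("training accuracy:", (np.sign(F) == y).mean())
```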
Adaboost Terms
- Learner = Hypothesis = Classifier
- Weak Learner: < 50% error over any distribution
- Strong Classifier: thresholded linear combination of weak learner outputs
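Written out in the usual notation (not copied from the slides), the strong classifier is the thresholded linear combination of the weak learner outputs h_t:

```latex
% Strong classifier as a thresholded (sign) linear combination of weak learners
H(x) = \operatorname{sign}\!\left(\sum_{t=1}^{T} \alpha_t\, h_t(x)\right),
\qquad h_t(x) \in \{-1, +1\}
```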
Discrete Adaboost (DiscreteAB) (Friedman’s wording)
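The body of this slide is an image in the source. As a reference transcription (following Friedman, Hastie, and Tibshirani 2000, not taken from the slide itself), Discrete AdaBoost in Friedman's wording is roughly:

```latex
% Discrete AdaBoost, Friedman et al. notation (transcribed, not from the slide)
\begin{enumerate}
  \item Start with weights $w_i = 1/N$, $i = 1,\dots,N$.
  \item For $m = 1,\dots,M$:
    \begin{enumerate}
      \item Fit the classifier $f_m(x) \in \{-1,1\}$ using weights $w_i$.
      \item Compute $\mathrm{err}_m = E_w\!\left[\mathbf{1}(y \neq f_m(x))\right]$ and
            $c_m = \log\!\frac{1-\mathrm{err}_m}{\mathrm{err}_m}$.
      \item Update $w_i \leftarrow w_i \exp\!\left(c_m \mathbf{1}(y_i \neq f_m(x_i))\right)$
            and renormalize so that $\sum_i w_i = 1$.
    \end{enumerate}
  \item Output the classifier $\operatorname{sign}\!\left(\sum_{m=1}^{M} c_m f_m(x)\right)$.
\end{enumerate}
```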
Discrete Adaboost (DiscreteAB) (Freund and Schapire’s wording)
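This slide's body is also an image. In Freund and Schapire's wording the same algorithm is usually written with a distribution $D_t(i)$ over examples and $\alpha_t = \tfrac{1}{2}\ln\frac{1-\epsilon_t}{\epsilon_t}$ (again a transcription of the standard form, not the slide content):

```latex
% AdaBoost, Freund and Schapire's notation (transcribed, not from the slide)
\begin{align*}
  D_1(i) &= 1/N, \\
  \epsilon_t &= \sum_{i:\, h_t(x_i) \neq y_i} D_t(i),
  \qquad \alpha_t = \tfrac{1}{2}\ln\frac{1-\epsilon_t}{\epsilon_t}, \\
  D_{t+1}(i) &= \frac{D_t(i)\exp\bigl(-\alpha_t\, y_i\, h_t(x_i)\bigr)}{Z_t}
  \quad\text{($Z_t$ a normalizer)}, \\
  H(x) &= \operatorname{sign}\!\Bigl(\sum_{t=1}^{T} \alpha_t\, h_t(x)\Bigr).
\end{align*}
```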
Adaboost with Confidence Weighted Predictions (RealAB)
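The slide body is an image; Real AdaBoost, in which the weak learners return confidence-weighted real values rather than hard labels, is commonly stated as follows (a transcription of the standard form, not the slide content):

```latex
% Real AdaBoost (confidence-weighted predictions), transcribed, not from the slide
\begin{enumerate}
  \item Start with weights $w_i = 1/N$.
  \item For $m = 1,\dots,M$:
    \begin{enumerate}
      \item Fit a class-probability estimate $p_m(x) = \hat{P}_w(y = 1 \mid x)$ using weights $w_i$.
      \item Set $f_m(x) = \tfrac{1}{2}\log\frac{p_m(x)}{1 - p_m(x)}$.
      \item Update $w_i \leftarrow w_i \exp\bigl(-y_i f_m(x_i)\bigr)$ and renormalize.
    \end{enumerate}
  \item Output $\operatorname{sign}\bigl(\sum_{m=1}^{M} f_m(x)\bigr)$.
\end{enumerate}
```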
Comparison: 2-node trees
Bound on Training Error (Schapire)
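The slide body is an image. The bound referred to is, in standard notation (with weighted error $\epsilon_t = \tfrac{1}{2} - \gamma_t$ in round $t$ and round-$t$ normalizer $Z_t$):

```latex
% Schapire's bound on AdaBoost's training error (standard form, not from the slide)
\frac{1}{N}\sum_{i=1}^{N}\mathbf{1}\bigl(H(x_i)\neq y_i\bigr)
\;\le\; \prod_{t=1}^{T} Z_t
\;=\; \prod_{t=1}^{T} 2\sqrt{\epsilon_t(1-\epsilon_t)}
\;=\; \prod_{t=1}^{T}\sqrt{1-4\gamma_t^2}
\;\le\; \exp\!\Bigl(-2\sum_{t=1}^{T}\gamma_t^2\Bigr)
```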
Finding a Weak Hypothesis
- Train the classifier (as usual) on the weighted training data
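For example (an assumption about tooling, not from the slides), many learners accept per-example weights directly, so the weak hypothesis for a round is an ordinary fit on the current weights; if a learner cannot take weights, resampling the training set in proportion to the weights is a common workaround:

```python
# Weighted training of a weak hypothesis (decision stump) on the current
# AdaBoost weights w.
from sklearn.tree import DecisionTreeClassifier

def fit_weak_hypothesis(X, y, w):
    """Train a decision stump on the weighted training data."""
    return DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
```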