# Adaboost Tutorial - Derek Hoiem, March 31, 2004



## Outline

- Background
- Adaboost algorithm
- Theory/interpretations
- Practical issues
- Face detection experiments
## What’s So Good About Adaboost

- Improves classification accuracy
- Can be used with many different classifiers
- Commonly used in many areas
- Simple to implement
- Not prone to overfitting

## A Brief History

- Bootstrapping
- Bagging
- Boosting (Schapire 1989)
- Adaboost (Schapire 1995)
## Bootstrap Estimation

- Repeatedly draw n samples from D
- For each set of samples, estimate a statistic
- The bootstrap estimate is the mean of the individual estimates
- Used to estimate a statistic (parameter) and its variance
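The procedure above can be sketched in a few lines. This is a minimal illustration, not code from the tutorial; the function name and the 1000-resample default are assumptions.

```python
import random

def bootstrap_estimate(data, statistic, n_boot=1000, seed=0):
    """Bootstrap: repeatedly resample `data` with replacement, compute
    `statistic` on each resample, and return the mean and variance of
    the individual estimates."""
    rng = random.Random(seed)
    n = len(data)
    estimates = [statistic([rng.choice(data) for _ in range(n)])
                 for _ in range(n_boot)]
    mean = sum(estimates) / n_boot
    var = sum((e - mean) ** 2 for e in estimates) / (n_boot - 1)
    return mean, var

# Example: bootstrap estimate of the sample mean and its variance
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
est, var = bootstrap_estimate(data, lambda xs: sum(xs) / len(xs))
```

The returned variance estimates how much the statistic would vary across fresh samples, which is the usual reason for bootstrapping in the first place.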

## Bagging - Aggregate Bootstrapping

- For i = 1 .. M: draw n' < n samples from D with replacement and learn classifier C_i
- The final classifier is a vote of C_1 .. C_M
- Increases classifier stability / reduces variance
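A hedged sketch of bagging with a simple 1-D threshold-stump weak learner (the stump learner, `frac`, and `M=25` are illustrative choices, not from the slides):

```python
import random

def learn_stump(xs, ys):
    """Weak learner: 1-D threshold stump chosen by brute-force search
    over thresholds and signs to minimize training error."""
    best = None
    for t in set(xs):
        for s in (1, -1):
            err = sum(1 for x, y in zip(xs, ys)
                      if (s if x >= t else -s) != y)
            if best is None or err < best[0]:
                best = (err, t, s)
    _, t, s = best
    return lambda x, t=t, s=s: s if x >= t else -s

def bagging_train(xs, ys, learn, M=25, frac=0.8, seed=0):
    """Train M classifiers, each on n' < n samples drawn with replacement."""
    rng = random.Random(seed)
    n = len(xs)
    m = max(1, int(frac * n))
    classifiers = []
    for _ in range(M):
        idx = [rng.randrange(n) for _ in range(m)]
        classifiers.append(learn([xs[i] for i in idx],
                                 [ys[i] for i in idx]))
    return classifiers

def bagging_predict(classifiers, x):
    """Final classifier: majority vote of C_1 .. C_M."""
    return 1 if sum(c(x) for c in classifiers) >= 0 else -1

# Example: separable 1-D data with labels in {-1, +1}
xs = [0.0, 1.0, 2.0, 3.0, 10.0, 11.0, 12.0, 13.0]
ys = [-1, -1, -1, -1, 1, 1, 1, 1]
classifiers = bagging_train(xs, ys, learn_stump)
```

Each stump sees a slightly different bootstrap sample, so their individual errors tend to be uncorrelated and the vote is more stable than any single stump, which is the variance-reduction claim in the bullet above.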
## Boosting (Schapire 1989)

1. Randomly select n_1 < n samples from D without replacement to obtain D_1; train weak learner C_1
2. Select n_2 < n samples from D, with half of the samples misclassified by C_1, to obtain D_2; train weak learner C_2
3. Select all samples from D on which C_1 and C_2 disagree; train weak learner C_3
- The final classifier is a vote of the weak learners
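The three steps above can be sketched as follows; this is an illustrative reconstruction with a stump weak learner, not Schapire's own listing, and the tie-breaking details (how D_2 is filled, fallbacks for empty sets) are assumptions.

```python
import random

def learn_stump(xs, ys):
    """Weak learner: best 1-D threshold stump by brute-force search."""
    best = None
    for t in set(xs):
        for s in (1, -1):
            err = sum(1 for x, y in zip(xs, ys)
                      if (s if x >= t else -s) != y)
            if best is None or err < best[0]:
                best = (err, t, s)
    _, t, s = best
    return lambda x, t=t, s=s: s if x >= t else -s

def boost_by_filtering(xs, ys, n1, seed=0):
    """Schapire-1989-style boosting: three weak learners, majority vote."""
    rng = random.Random(seed)
    idx = list(range(len(xs)))
    rng.shuffle(idx)

    # Step 1: train C1 on n1 samples drawn without replacement.
    d1 = idx[:n1]
    c1 = learn_stump([xs[i] for i in d1], [ys[i] for i in d1])

    # Step 2: train C2 on a set where half the samples are ones C1 misclassifies.
    wrong = [i for i in idx if c1(xs[i]) != ys[i]]
    right = [i for i in idx if c1(xs[i]) == ys[i]]
    k = min(len(wrong), len(right))
    d2 = wrong[:k] + right[:k]
    c2 = learn_stump([xs[i] for i in d2], [ys[i] for i in d2]) if d2 else c1

    # Step 3: train C3 on the samples where C1 and C2 disagree.
    d3 = [i for i in idx if c1(xs[i]) != c2(xs[i])]
    c3 = learn_stump([xs[i] for i in d3], [ys[i] for i in d3]) if d3 else c1

    # Final classifier: majority vote of the three weak learners.
    return lambda x: 1 if c1(x) + c2(x) + c3(x) > 0 else -1

# Example: separable 1-D data
xs = [0.0, 1.0, 2.0, 3.0, 10.0, 11.0, 12.0, 13.0]
ys = [-1, -1, -1, -1, 1, 1, 1, 1]
h = boost_by_filtering(xs, ys, n1=4)
```

The key idea is that D_2 concentrates on C_1's mistakes and D_3 on the points where the first two disagree, so each new learner is forced to add information the vote does not already have.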

## Adaboost - Adaptive Boosting

- Instead of resampling, re-weight the training examples
- After re-weighting, the previous weak learner has only 50% accuracy over the new distribution
- Can be used to learn weak classifiers
- The final classification is based on a weighted vote of the weak classifiers
## Adaboost Terms

- Learner = hypothesis = classifier
- Weak learner: < 50% error over any distribution
- Strong classifier: thresholded linear combination of weak learner outputs

## Discrete Adaboost (DiscreteAB) (Friedman’s wording)

## Discrete Adaboost (DiscreteAB) (Freund and Schapire’s wording)
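The algorithm boxes on these two slides are images that do not survive text extraction. As a hedged sketch of Discrete AdaBoost (the stump weak learner and brute-force threshold search are illustrative choices, not the slides' own listing):

```python
import math

def adaboost_train(xs, ys, T=20):
    """Discrete AdaBoost with 1-D decision stumps.

    Each round fits the stump with the lowest weighted error eps_t,
    gives it vote weight alpha_t = 0.5 * log((1 - eps_t) / eps_t), and
    re-weights the samples so that the stump just chosen has only 50%
    accuracy over the new distribution."""
    n = len(xs)
    w = [1.0 / n] * n
    model = []                      # list of (alpha, threshold, sign)
    for _ in range(T):
        # Weak learner: stump minimizing *weighted* error (brute force).
        best = None
        for t in set(xs):
            for s in (1, -1):
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if (s if x >= t else -s) != y)
                if best is None or err < best[0]:
                    best = (err, t, s)
        eps, t, s = best
        if eps <= 0 or eps >= 0.5:  # perfect, or no better than chance
            if eps <= 0:
                model.append((1.0, t, s))
            break
        alpha = 0.5 * math.log((1 - eps) / eps)
        model.append((alpha, t, s))
        # Re-weight: up-weight mistakes, down-weight hits, renormalize.
        w = [wi * math.exp(-alpha * y * (s if x >= t else -s))
             for wi, x, y in zip(w, xs, ys)]
        z = sum(w)
        w = [wi / z for wi in w]
    return model

def adaboost_predict(model, x):
    """Strong classifier: sign of the weighted vote of the stumps."""
    f = sum(a * (s if x >= t else -s) for a, t, s in model)
    return 1 if f >= 0 else -1

# Example: separable 1-D data with labels in {-1, +1}
xs = [0.0, 1.0, 2.0, 3.0, 10.0, 11.0, 12.0, 13.0]
ys = [-1, -1, -1, -1, 1, 1, 1, 1]
model = adaboost_train(xs, ys)
```

The re-weighting step is the "adaptive" part: the exponential update makes the just-chosen stump's weighted error exactly 1/2 on the new weights, so the next round must find something genuinely new.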

## Adaboost with Confidence-Weighted Predictions (RealAB)
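This slide's algorithm box is also an image. As a sketch of the standard Real AdaBoost round (following Friedman, Hastie, and Tibshirani's formulation, assumed here rather than taken from the slide), each weak learner outputs a real-valued confidence instead of a hard ±1 vote:

```latex
% Round m: fit a weighted class-probability estimate, then score with
% half its log-odds; large |f_m(x)| = high confidence.
p_m(x) = \hat{P}_w(y = 1 \mid x), \qquad
f_m(x) = \tfrac{1}{2} \log \frac{p_m(x)}{1 - p_m(x)}
% Re-weight and renormalize, as in the discrete version:
w_i \leftarrow w_i \, e^{-y_i f_m(x_i)}
% Final strong classifier:
H(x) = \operatorname{sign}\!\Bigl(\sum_m f_m(x)\Bigr)
```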
## Comparison: 2-Node Trees

## Bound on Training Error (Schapire)
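The bound itself is lost with the slide image; the standard statement of Schapire's result, with round-t weighted error \(\varepsilon_t\) and edge \(\gamma_t = \tfrac{1}{2} - \varepsilon_t\), is:

```latex
% Training error of the strong classifier H after T rounds:
\frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\{H(x_i) \ne y_i\}
  \;\le\; \prod_{t=1}^{T} 2\sqrt{\varepsilon_t (1 - \varepsilon_t)}
  \;=\; \prod_{t=1}^{T} \sqrt{1 - 4\gamma_t^2}
  \;\le\; \exp\Bigl(-2 \sum_{t=1}^{T} \gamma_t^2\Bigr)
```

So as long as every weak learner beats chance by some margin \(\gamma_t \ge \gamma > 0\), the training error drops exponentially in the number of rounds.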
## Finding a Weak Hypothesis

- Train the classifier (as usual) on the weighted training data

