Stat841f09 - Wiki Course Notes

# This paper introduces the boosting algorithm AdaBoost



A weighted combination of the generated classifiers will be used as the boosted classifier: the better each generated classifier is, the larger its weight in the final classifier.

Paper about Boosting (http://www.site.uottawa.ca/~stan/csi5387/boost-tut-ppr.pdf): Boosting is a general method for improving the accuracy of any given learning algorithm. This paper introduces the boosting algorithm AdaBoost and explains the underlying theory of boosting, including an explanation of why boosting often does not suffer from overfitting, as well as boosting's relationship to support-vector machines. Finally, the paper gives some recent examples of applications of boosting.

AdaBoost Algorithm (http://en.wikipedia.org/wiki/AdaBoost)

Let's first look at the original boosting algorithm:

1. Set the weights of all $n$ training points equal: $w_i = \frac{1}{n}$.
2. For $m = 1, \dots, M$:
   1. Find the classifier $h_m$ that minimizes the weighted error $\epsilon_m = \frac{\sum_{i=1}^{n} w_i \, I(h_m(x_i) \neq y_i)}{\sum_{i=1}^{n} w_i}$, where $I(\cdot)$ is the indicator function.
   2. Let $\alpha_m = \frac{1}{2} \ln \frac{1 - \epsilon_m}{\epsilon_m}$.
   3. Update the weights: $w_i \leftarrow w_i \, e^{\alpha_m I(h_m(x_i) \neq y_i)}$, so only misclassified points have their weights increased.
3. The final classifier is $H(x) = \mathrm{sign}\left(\sum_{m=1}^{M} \alpha_m h_m(x)\right)$.

When applying boosting to different classifiers, the first step in 2 may differ, since we can define the most appropriate misclassification error for the problem at hand. However, the major idea, giving higher weight to misclassified examples, does not change across classifiers.

Boosting works very well in practice, and there is a lot of research and published work on why it works this well. One possible explanation is that it actually maximizes the margin of the classifiers. We can see that in AdaBoost, if training points are accurately classified, their weights for the next classifier are kept unchanged, while if points are not accurately classified, their weights are raised. As a result, easier examples get classified by the very first few classifiers, and hard examples are learned later with increasing emphasis.
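The steps above can be sketched in Python. This is a minimal illustration, not the notes' own code: it assumes one-dimensional inputs, labels in {−1, +1}, and decision stumps as the weak learners (the notes do not fix a particular weak learner).

```python
import math

def stump(threshold, direction):
    # Weak learner: predicts +1 when direction*(x - threshold) > 0, else -1.
    return lambda x: 1 if direction * (x - threshold) > 0 else -1

def train_adaboost(X, y, rounds=3):
    n = len(X)
    w = [1.0 / n] * n                                  # step 1: equal weights
    # Candidate stumps: one threshold per training point, both orientations.
    candidates = [stump(t, d) for t in X for d in (+1, -1)]
    ensemble = []                                      # list of (alpha_m, h_m)
    for _ in range(rounds):
        # Step 2.1: pick the stump minimizing the weighted error.
        def weighted_error(h):
            bad = sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi)
            return bad / sum(w)
        h = min(candidates, key=weighted_error)
        eps = min(max(weighted_error(h), 1e-10), 1 - 1e-10)  # keep the log finite
        # Step 2.2: a classifier's weight grows as its error shrinks.
        alpha = 0.5 * math.log((1 - eps) / eps)
        # Step 2.3: raise the weights of misclassified points only.
        w = [wi * (math.exp(alpha) if h(xi) != yi else 1.0)
             for wi, xi, yi in zip(w, X, y)]
        ensemble.append((alpha, h))
    return ensemble

def predict(ensemble, x):
    # Step 3: weighted majority vote of the weak classifiers.
    return 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1
```

For example, on the toy data `X = [1, 2, 3, 4, 5, 6]`, `y = [1, 1, 1, -1, -1, -1]`, the ensemble recovers the labels of all training points.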
Finally, all the classifiers are combined through a majority vote, which is also weighted by their accuracy, taking into consideration both th...
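Concretely, the weighted majority vote reduces to a sign of a weighted sum. The numbers below are purely illustrative, not from the notes:

```python
# Three weak classifiers vote on a single point; the alphas are the
# accuracy-derived weights from the boosting rounds (illustrative values).
votes  = [+1, -1, +1]
alphas = [0.9, 0.4, 0.7]

score = sum(a * v for a, v in zip(alphas, votes))   # 0.9 - 0.4 + 0.7
label = 1 if score >= 0 else -1                     # weighted majority vote
```

Note that the middle classifier is outvoted: even though one of three votes is −1, the two accurate classifiers carry more weight, so the final label is +1.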

