CSC2515 Fall 2008 Introduction to Machine Learning

Lecture 11a: Boosting and Naïve Bayes

All lecture slides will be available as .ppt, .ps, & .htm at www.cs.toronto.edu/~hinton

Many of the figures are provided by Chris Bishop from his textbook, "Pattern Recognition and Machine Learning".
A commonsense way to use limited computational resources

First train a model on all of the data.
– Let's assume it gets the great majority of the cases right.
Then train another model on all the cases the first model got wrong, plus an equal number that it got right.
– This focuses the resources on modelling the hard cases.
Train a third model, focusing on cases that either or both of the previous models got wrong.
– Then use a simple committee of the three models (sketched in the code below).
This is quite effective for learning to recognize handwritten digits, but it is also very heuristic.
– Can we give it a theoretical foundation?
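As a concrete illustration, here is a minimal Python sketch of that three-model procedure. It is not from the slides: the choice of shallow scikit-learn decision trees, the tree depth, and the random sampling of an equal number of correctly classified cases are all illustrative assumptions.

```python
# Illustrative sketch only; the model class (shallow scikit-learn trees) and
# the sampling details are assumptions, not part of the original slides.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def train_heuristic_committee(X, y, seed=0):
    rng = np.random.default_rng(seed)

    # Model 1: train on all of the data.
    m1 = DecisionTreeClassifier(max_depth=3).fit(X, y)
    wrong1 = m1.predict(X) != y

    # Model 2: every case model 1 got wrong, plus an equal number
    # (sampled at random, assuming enough correct cases) that it got right.
    n_wrong = int(wrong1.sum())
    right_sample = rng.choice(np.flatnonzero(~wrong1), size=n_wrong, replace=False)
    idx2 = np.concatenate([np.flatnonzero(wrong1), right_sample])
    m2 = DecisionTreeClassifier(max_depth=3).fit(X[idx2], y[idx2])

    # Model 3: cases that either (or both) of the previous models got wrong.
    wrong_either = wrong1 | (m2.predict(X) != y)
    m3 = DecisionTreeClassifier(max_depth=3).fit(X[wrong_either], y[wrong_either])
    return [m1, m2, m3]

def committee_predict(models, X):
    # Simple unweighted majority vote; assumes binary labels in {0, 1}.
    votes = np.stack([m.predict(X) for m in models])
    return (votes.mean(axis=0) > 0.5).astype(int)
```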
Making weak learners stronger

Suppose you have a weak learning module (a "base classifier") that can always get a fraction 0.5 + ε of the cases correct when given a two-way classification task.
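Boosting, the topic of this lecture, makes this idea precise: reweight the training cases after each round so that the next weak learner concentrates on the cases the current combination gets wrong, then combine the learners by a weighted vote. Below is a minimal sketch in the spirit of standard AdaBoost; the decision-stump weak learners from scikit-learn and the ±1 label encoding are illustrative assumptions, not taken from these slides.

```python
# A minimal AdaBoost-style sketch; weak learner choice is an assumption.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost(X, y, n_rounds=50):
    """y must be encoded in {-1, +1}. Returns (stumps, alphas)."""
    n = len(y)
    w = np.full(n, 1.0 / n)                 # start with uniform case weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum()            # weighted error of this weak learner
        if err >= 0.5:                      # no longer better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)      # up-weight the cases this stump got wrong
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, np.array(alphas)

def boosted_predict(stumps, alphas, X):
    # Weighted vote of the weak learners, thresholded at zero.
    scores = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
    return np.sign(scores)
```

Provided each round's weighted error stays below 0.5 (exactly the 0.5 + ε assumption above), the training error of the weighted vote can be shown to fall exponentially with the number of rounds, which is the theoretical foundation the previous slide asked for.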