Inductive Learning: What Property Should h Have?

Unformatted text preview: …training data.

58 Inductive Learning
• The decision tree approach is one example of an inductive learning technique:
• Suppose that data x is related to output y by an unknown function y = f(x)
• Suppose that we have observed training examples {(x1,y1),...,(xn,yn)}
• Inductive learning problem: recover a function h (the "hypothesis") such that h(x) ≈ f(x)
• y = h(x) predicts y from the input data x
• The challenge: the hypothesis space (the space of all hypotheses h of a given form; for example, the space of all possible decision trees over a set of M attributes) is huge, and many different hypotheses may agree with the training data.

59 Inductive Learning
• What property should h have?
• It should agree with the training data…

60 Inductive Learning
[Figure: two "stupid" hypotheses that fit the training data perfectly]
• What property should h have?
• It should agree with the training data…
• But that can lead to arbitrarily complex hypotheses, and there are many of them; which one should we choose?

61 Inductive Learning
• Problems with a complex hypothesis:
• It can lead to completely wrong predictions on new test data…
• It does not generalize beyond the training data; it overfits the training data.

62 Inductive Learning
• Simplicity principle (Occam's razor): "entities are not to be multiplied beyond necessity"…
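The ideas on slides 58-61 can be sketched numerically. The snippet below is a minimal illustration, not part of the original lecture: the target f, the noise level, and the choice of polynomial hypothesis spaces are all assumptions made for the example. A degree-7 polynomial (a "complex" hypothesis) can agree with 8 training points perfectly, while a degree-2 polynomial (a "simple" hypothesis) cannot; comparing their errors on fresh test data shows why perfect training fit alone is not the property we want h to have.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unknown target function f (chosen only for illustration).
def f(x):
    return np.sin(x)

# Observed training examples (x1, y1), ..., (xn, yn), with a little noise.
n = 8
x_train = rng.uniform(0.0, 3.0, size=n)
y_train = f(x_train) + rng.normal(0.0, 0.05, size=n)

# Two hypotheses h(x) ≈ f(x) from different hypothesis spaces:
# a simple degree-2 polynomial, and a degree-7 polynomial with enough
# free coefficients to interpolate all 8 training points exactly.
h_simple = np.polynomial.Polynomial.fit(x_train, y_train, deg=2)
h_complex = np.polynomial.Polynomial.fit(x_train, y_train, deg=7)

def mse(h, x, y):
    """Mean squared error of hypothesis h on examples (x, y)."""
    return float(np.mean((h(x) - y) ** 2))

# The complex hypothesis agrees with the training data (near-)perfectly...
print("train MSE, simple :", mse(h_simple, x_train, y_train))
print("train MSE, complex:", mse(h_complex, x_train, y_train))

# ...but that does not guarantee it generalizes: compare both on new
# test data drawn from the same (noise-free) f.
x_test = rng.uniform(0.0, 3.0, size=200)
y_test = f(x_test)
print("test MSE, simple :", mse(h_simple, x_test, y_test))
print("test MSE, complex:", mse(h_complex, x_test, y_test))
```

Typically the complex hypothesis drives its training error to essentially zero while the simple one does not, yet the simple one is often competitive or better on the test set. This is the simplicity principle of slide 62 in miniature: with many hypotheses agreeing equally well with the data, prefer the less complex one.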

This note was uploaded on 11/03/2010 for the course UNIVERSITY CS6375 taught by Professor Vicentng during the Fall '10 term at University of Texas at Dallas, Richardson.
