3.1 Learning in Feature Space

The complexity of the target function to be learned depends on the way it is represented, and the difficulty of the learning task can vary accordingly. Ideally, a representation that matches the specific learning problem should be chosen. One common preprocessing strategy in machine learning therefore involves changing the representation of the data, mapping each input x = (x1, ..., xn) to a vector of new coordinates f(x) = (f1(x), ..., fN(x)). This step is equivalent to mapping the input space X into a new space, F = {f(x) | x ∈ X}.

Example 3.1 Consider the target function given by Newton's law of gravitation, which expresses the gravitational force between two bodies with masses m1 and m2 and separation r as proportional to m1 m2 / r^2. This law is expressed in terms of the observable quantities, mass and distance. A linear machine such as those described in Chapter 2 could not represent it as written, but a simple change of coordinates, replacing each observable by its logarithm, gives a representation that could be learned by a linear machine.
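As a brief worked sketch of this change of coordinates (the constant C and the coordinate names x, y, z below are notational choices, not taken from the text): in the original observables the law reads

    force = C * m1 * m2 / r^2,

which is a nonlinear function of (m1, m2, r). Passing to logarithmic coordinates (x, y, z) = (ln m1, ln m2, ln r) turns it into

    ln(force) = ln C + x + y - 2z,

which is linear in the new coordinates and can therefore be represented, and learned, by a linear machine of the kind described in Chapter 2.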
The fact that simply mapping the data into another space can greatly simplify the task has been known for a long time in machine learning, and has given rise to a number of techniques for selecting the best representation of the data. The quantities introduced to describe the data are usually called features, while the original quantities are sometimes called attributes. The task of choosing the most suitable representation is known as feature selection. The space X is referred to as the input space, while F = {f(x) | x ∈ X} is called the feature space. Figure 3.1 shows an example of a feature mapping from a two-dimensional input space to a two-dimensional feature space, where the data cannot be separated by a linear function in the input space, but can be in the feature space. The aim of this chapter is to show how such mappings can be made into very high-dimensional spaces where linear separation becomes much easier.
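The short sketch below illustrates the kind of mapping described above. The dataset, the feature map f(x1, x2) = (x1^2, x2^2), and the weights are illustrative assumptions, not taken from the book's Figure 3.1: points labelled by whether they lie inside the unit circle are not linearly separable in the input space, but the rule x1^2 + x2^2 < 1 is linear in the squared coordinates, so a linear function separates them in the feature space.

# Illustrative sketch (assumed example, not the book's Figure 3.1).
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(200, 2))        # two-dimensional input space
y = np.where((X ** 2).sum(axis=1) < 1.0, 1, -1)  # +1 inside the unit circle

def feature_map(X):
    """Map each input (x1, x2) to the feature vector (x1^2, x2^2)."""
    return X ** 2

F = feature_map(X)

# In feature space the linear rule sign(w . f(x) + b) with w = (-1, -1), b = 1
# reproduces the circle rule exactly, so a linear machine separates the data.
w, b = np.array([-1.0, -1.0]), 1.0
predictions = np.sign(F @ w + b)
print("accuracy of the linear rule in feature space:", (predictions == y).mean())

The point of the sketch is only that a change of representation can turn a nonlinearly separable problem into a linearly separable one; the chapter goes on to treat such mappings in much higher-dimensional spaces.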