Mach Learn (2006) 63:183–205
DOI 10.1007/s10994-006-6266-6

Classification-based objective functions

Michael Rimer · Tony Martinez

Received: 3 June 2005 / Revised: 4 November 2005 / Accepted: 11 November 2005 / Published online: 3 March 2006
© Springer Science + Business Media, LLC 2006

Abstract  Backpropagation, like most learning algorithms that can form complex decision surfaces, is prone to overfitting. This work presents classification-based objective functions, an approach to training artificial neural networks on classification problems. Classification-based learning attempts to guide the network directly to correct pattern classification rather than using common error minimization heuristics, such as sum-squared error (SSE) and cross-entropy (CE), that do not explicitly minimize classification error. CB1 is presented here as a novel objective function for learning classification problems. It seeks to minimize classification error directly by backpropagating error only on misclassified patterns, and only from culprit output nodes. CB1 discourages weight saturation and overfitting and achieves higher accuracy on classification problems than optimizing SSE or CE. Experiments on a large OCR data set have shown CB1 to significantly increase generalization accuracy over SSE or CE optimization, from 97.86% and 98.10%, respectively, to 99.11%. Comparable results are achieved over several data sets from the UC Irvine Machine Learning Database Repository, with an average increase in accuracy from 90.7% and 91.3% using optimized SSE and CE networks, respectively, to 92.1% for CB1. Analysis indicates that CB1 performs a fundamentally different search of the feature space than optimizing SSE or CE and produces significantly different solutions.

Keywords  Neural networks · Backpropagation · Classification · Objective functions

Editor: Risto Miikkulainen

M. Rimer · T. Martinez
Computer Science Department, Brigham Young University, Provo, UT 84602, USA
e-mail: mrimer@axon.cs.byu.edu
T. Martinez e-mail: martinez@cs.byu.edu

1. Introduction

Artificial neural networks have received substantial attention as robust learning models for applications involving classification and function approximation (Rumelhart, Hinton, & Williams, 1985). This work proposes the use of classification-based (CB) objective functions to improve backpropagation, increasing generalization on complex classification tasks. The CB1 algorithm is presented as the main contribution. It is an example of a CB objective function suited to learning classification tasks. CB1 seeks to minimize classification error directly by backpropagating error only on misclassified patterns, and only from the output nodes responsible for the misclassification. In doing so, it updates the network parameters as little as possible. This technique discourages weight saturation and overfitting and is conducive to higher accuracy on classification problems than optimizing with respect to common error...
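The core idea described above (backpropagate error only on misclassified patterns, and only from the culprit output nodes) can be sketched roughly as follows. This is a minimal illustration, not the paper's exact CB1 formulation: the function name `cb1_errors`, the treatment of ties as misclassifications, and the specific error magnitudes (pulling the target output up to the highest competitor and pushing each culprit down to the target) are assumptions made here for clarity.

```python
import numpy as np

def cb1_errors(outputs, target_idx):
    """Sketch of CB1-style error assignment for one pattern (simplified).

    outputs    -- 1-D array of output-node activations
    target_idx -- index of the correct class
    Returns an error vector; zero entries backpropagate no error.
    """
    errors = np.zeros_like(outputs, dtype=float)
    target = outputs[target_idx]
    competitors = np.delete(outputs, target_idx)
    highest = competitors.max()
    if target > highest:
        # Correctly classified: no error at all, so no weight update.
        return errors
    # Misclassified (ties counted as errors in this sketch):
    # raise the target node toward the highest competing output ...
    errors[target_idx] = highest - target
    # ... and lower every culprit node that ties or beats the target.
    for i, o in enumerate(outputs):
        if i != target_idx and o >= target:
            errors[i] = target - o
    return errors
```

Because correctly classified patterns contribute zero error, the network is perturbed as little as possible, which is consistent with the paper's claim that CB1 discourages weight saturation and overfitting relative to SSE or CE, both of which push outputs toward hard 0/1 targets on every pattern.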