Ensemble structure of evolutionary artificial neural networks (Xin Yao and Yong Liu, 1996)

Xin Yao and Yong Liu
Computational Intelligence Group, School of Computer Science
University College, The University of New South Wales,
Australian Defence Force Academy, Canberra, ACT, Australia 2600
Email: [email protected], WWW: http://www.cs.adfa.oz.au/~xin

Abstract: Evolutionary artificial neural networks (EANNs) refer to a special class of artificial neural networks (ANNs) in which evolution is another fundamental form of adaptation in addition to learning. Evolution can be introduced at various levels of ANNs. It can be used to evolve weights, architectures, and learning parameters and rules. This paper is concerned with the evolution of ANN architectures, where an evolutionary algorithm is used to evolve a population of ANNs. The current practice in evolving ANNs is to choose the best ANN in the last population as the final result. This paper proposes a novel approach that forms the final result by combining all the individuals in the last generation, in order to make the best use of all the information contained in the whole population. This approach regards a population of ANNs as an ensemble of ANNs and uses a method to combine them. We have used four simple methods in our computational studies. The first is the majority voting method. The second and third are linear combination methods over the whole population. The fourth is a linear combination method over a subset of the whole population, where the near-optimal subset is obtained by a genetic algorithm search. Our experiments have shown that all four methods produce better results than those produced by the single best individual.

I. INTRODUCTION

EANNs have been studied widely in recent years [1; 2; 3]. They provide not only an automatic method to develop ANNs, but also an approach to study evolution and learning in the same framework. This paper is mainly concerned with the evolution of ANN architectures and weights (including biases), where an evolutionary algorithm is used to evolve ANN architectures and/or weights.

Although there are many studies on how to evolve ANNs most effectively and efficiently [1; 2; 3], no work has been reported on how to generate the final result from the evolutionary process. The common practice is to choose the best individual in the last generation as the final result. While this is fine in evolutionary optimisation, it is certainly worth investigating in evolutionary learning, where optimality is not clearly defined. The question we are most interested in is: does the best individual in the last generation contain all the useful information in the whole population? The current practice implies that the answer is yes. However, learning is different from optimisation, although the former is often implemented in a computer as the latter. For example, backpropagation (BP) is in essence a gradient-based optimisation algorithm and is used to learn the weights of ANNs. This does not mean that an ANN with the minimum mean square error (trained by BP) will be the optimally learned one.
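The paper itself contains no source code; the following is a minimal sketch, in Python with NumPy (both choices are ours, not the authors'), of the first two kinds of combination method described in the abstract: majority voting over class labels, and a linear combination of real-valued network outputs. The function names, array shapes, and the uniform default weights are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def majority_vote(predictions):
    """Majority voting over an ensemble of classifiers.

    predictions: (n_networks, n_samples) array of integer class labels,
    one row per network in the final population (shape is an assumption).
    Returns the most frequent label for each sample.
    """
    n_samples = predictions.shape[1]
    combined = np.empty(n_samples, dtype=predictions.dtype)
    for j in range(n_samples):
        labels, counts = np.unique(predictions[:, j], return_counts=True)
        combined[j] = labels[np.argmax(counts)]  # ties: smallest label wins
    return combined

def linear_combination(outputs, weights=None):
    """Linear combination of real-valued network outputs.

    outputs: (n_networks, n_samples) array of raw network outputs.
    weights: optional (n_networks,) weight vector; with weights=None this
    reduces to simple averaging over the whole population.
    """
    if weights is None:
        weights = np.full(outputs.shape[0], 1.0 / outputs.shape[0])
    return weights @ outputs
```

The fourth method in the abstract restricts the linear combination to a subset of the population found by a genetic algorithm; in this sketch that would amount to searching over binary inclusion masks and applying linear_combination to the selected rows only.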