inria-00149181, version 1 - 24 May 2007

A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks

Benoît Siri (1), Hugues Berry (1,*), Bruno Cessac (2,3), Bruno Delord (4), and Mathias Quoy (5)

(1) Team Alchemy, INRIA, Parc Club Orsay Université, 4 rue J. Monod, 91893 Orsay Cedex, France
(2) Institut Non Linéaire de Nice, UMR 6618 CNRS-Université de Nice, 1361 route des Lucioles, 06560 Valbonne, France
(3) Team Odyssee, INRIA, 2004 Route des Lucioles, 06902 Sophia Antipolis, France
(4) ANIM, U742 INSERM - Université P.M. Curie, 9 quai Saint-Bernard, 75005 Paris, France
(5) ETIS, UMR 8051 CNRS-Université de Cergy-Pontoise-ENSEA, 6 avenue du Ponceau, BP 44, 95014 Cergy-Pontoise Cedex, France

The analysis of learning in recurrent neural networks is challenging because neuron activity and learning dynamics are mutually coupled: neuron activity depends on the synaptic weight network, which itself varies non-trivially under the influence of neuron activity. Understanding this interwoven evolution demands adapted theoretical tools. In this article, we present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks. Using theoretical tools from dynamical systems and graph theory, we study a generic "Hebb-like" learning rule that can include passive forgetting and different time scales for neuron activity and learning dynamics. We first show that the classical structural statistics from the so-called "complex networks" field (degree distribution, mean shortest path, clustering index, modularity) do not provide useful insights into the coupling between neuron dynamics and network evolution. Instead, this coupling can be analyzed more efficiently through the study of Jacobian matrices, which provide both a structural and a dynamical point of view on the evolution of the neural network. In this way, we show that "Hebb-like" learning reduces the complexity of the dynamics, as manifested by a systematic decay of the largest Lyapunov exponent. This effect is caused by a contraction of the spectral radius of the Jacobian matrices, induced either by passive forgetting or by saturation of the neurons. As a consequence, learning drives the system from chaos to a steady state through a sequence of bifurcations. We show that the network's sensitivity to the input pattern is maximal at the "edge of chaos". We also emphasize the role of feedback circuits in the Jacobian matrices and the link to cooperative systems.

I. INTRODUCTION

The mathematical study of learning effects (or, more generally, of synaptic plasticity) in neural networks is a difficult task, because the dynamics of the neurons depends on the synaptic weight network, which itself evolves non-trivially under the influence of the neuron dynamics. Understanding this mutual coupling between neuron dynamics and network structure (and its effects on the computational efficiency of the neural network) is a key problem in computational...
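To make the setting described in the abstract concrete, the following is a minimal numerical sketch, not the paper's exact model: a discrete-time random recurrent network x(t+1) = tanh(W x(t)) whose weights are updated by a Hebb-like rule with passive forgetting, W <- (1 - lam) W + (alpha/N) x x^T, while the largest Lyapunov exponent is estimated from the log-growth rate of a tangent vector propagated by the Jacobian diag(1 - tanh(Wx)^2) W. The specific Hebbian term, the tanh activation, and all parameter values (N, g, alpha, lam, iteration counts) are illustrative assumptions, not values taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not the paper's values)
N = 100        # number of neurons
g = 3.0        # gain of the initial random weights (controls initial chaoticity)
alpha = 0.01   # Hebbian learning rate
lam = 0.01     # passive-forgetting rate

# Random recurrent weight matrix and random initial state
W = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
x = rng.uniform(-1.0, 1.0, size=N)

def largest_lyapunov(W, x0, steps=500):
    # Estimate the largest Lyapunov exponent as the average log-growth of a
    # tangent vector propagated by the Jacobian DF(x) = diag(1 - tanh(Wx)^2) @ W.
    x = x0.copy()
    v = rng.normal(size=len(x0))
    v /= np.linalg.norm(v)
    acc = 0.0
    for _ in range(steps):
        u = W @ x
        x = np.tanh(u)
        J = (1.0 - x ** 2)[:, None] * W   # rows of W scaled by f'(u) = 1 - tanh(u)^2
        v = J @ v
        norm = np.linalg.norm(v)
        acc += np.log(norm)
        v /= norm
    return acc / steps

for epoch in range(201):
    # Slow/fast separation: let the neuron dynamics run between two weight updates
    for _ in range(200):
        x = np.tanh(W @ x)
    # "Hebb-like" update with passive forgetting (illustrative form): every weight
    # decays at rate lam and is reinforced by the product of post- and pre-synaptic activities.
    W = (1.0 - lam) * W + (alpha / N) * np.outer(x, x)
    if epoch % 50 == 0:
        print(f"epoch {epoch:3d}: largest Lyapunov exponent ~ {largest_lyapunov(W, x):+.3f}")

Under these assumptions, one would expect the printed Lyapunov estimate to decrease over learning epochs and eventually become negative, which is the kind of chaos-to-steady-state transition, driven by the contraction of the Jacobian spectral radius, that the abstract describes.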