13 The Hopfield Model

One of the milestones for the current renaissance in the field of neural networks was the associative model proposed by Hopfield at the beginning of the 1980s. Hopfield's approach illustrates the way theoretical physicists like to think about ensembles of computing units. No synchronization is required, each unit behaving as a kind of elementary system in complex interaction with the rest of the ensemble. An energy function must be introduced to harness the theoretical complexities posed by such an approach. The next two sections deal with the structure of Hopfield networks. We then proceed to show that the model converges to a stable state and that two kinds of learning rules can be used to find appropriate network weights.

13.1 Synchronous and asynchronous networks

A relevant issue for the correct design of recurrent neural networks is the adequate synchronization of the computing elements. In the case of McCulloch-Pitts networks we solved this difficulty by assuming that the activation of each computing element consumes a unit of time. The network is built taking this delay into account and by arranging the elements and their connections in the necessary pattern. When the arrangement becomes too contrived, additional units can be included which serve as delay elements. What happens when this assumption is lifted, that is, when the synchronization of the computing elements is eliminated?

13.1.1 Recursive networks with stochastic dynamics
We discussed the design and operation of associative networks in the previous chapter. The synchronization of the output was achieved by requiring that all computing elements evaluate their inputs and compute their output simultaneously. Under this assumption the operation of the associative memory can be described with simple linear algebraic methods. The excitation of the output units is computed using vector-matrix multiplication and evaluating the sign function at each node. The methods we have used before to avoid dealing explicitly with the synchronization problem have the disadvantage, from the point of view of both biology and physics, that global information is needed, namely a global time. Whereas in conventional computers synchronization of the digital building blocks is achieved using a clock signal, there is no such global clock in biological systems. In a more biologically oriented simulation, global synchronization should thus be avoided. In this chapter we deal with the problem of identifying the properties of neural networks lacking global synchronization.

Networks in which the computing units are activated at different times and which provide a computation after a variable amount of time are stochastic automata. Networks built from this kind of unit behave like stochastic dynamical systems.
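To make the contrast concrete, the following is a minimal NumPy sketch, not taken from the book, of the two update modes just described: a synchronous recall step, in which every unit evaluates the product of the weight matrix and the state vector and applies the sign function at the same time, and an asynchronous step, in which units fire one at a time in a random order without any global clock. The Hebbian-style weight matrix, the two stored bipolar patterns, and the convention that a unit keeps its previous state when its net input is zero are illustrative assumptions, not the chapter's own example.

```python
import numpy as np

rng = np.random.default_rng(0)

def synchronous_step(W, x):
    """All units evaluate their inputs and fire simultaneously:
    one vector-matrix multiplication followed by the sign function.
    Ties (zero net input) keep the previous state (an assumed convention)."""
    h = W @ x
    return np.where(h > 0, 1, np.where(h < 0, -1, x))

def asynchronous_step(W, x):
    """Units fire one at a time in a random order, so no global clock is
    needed; each unit sees the current state of all the other units."""
    x = x.copy()
    for i in rng.permutation(len(x)):
        h = W[i] @ x
        if h != 0:
            x[i] = 1 if h > 0 else -1
    return x

# Hebbian-style weights for two stored bipolar patterns (illustrative only).
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1]])
W = patterns.T @ patterns
np.fill_diagonal(W, 0)                      # no self-connections

noisy = np.array([1, -1, 1, -1, -1, -1])    # pattern 0 with one flipped bit
print(synchronous_step(W, noisy))           # recovers pattern 0
print(asynchronous_step(W, noisy))          # also recovers pattern 0
```

With these weights both update modes drive the corrupted vector back to the stored pattern; why asynchronous dynamics must reach such a stable state is exactly what the energy-function argument developed later in the chapter establishes.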