Chapter 5  Hopfield Network

Recurrent Network
• Inspired by ideas from statistical physics.
• Characteristics:
  – abundant use of feedback,
  – symmetric synaptic connections,
  – nonlinear computing units.
• Examples: Hopfield Network, Boltzmann Machine, Mean-Field-Theory (MFT) Machine.

Hopfield Network
• Stores information in dynamically stable configurations.
• Unsupervised learning: store the fundamental memories by locating each pattern (fundamental memory) at the bottom of a "valley" in the energy landscape.
• A dynamical procedure minimizes the energy of the network so that each valley becomes a basin of attraction, i.e., each fundamental memory ξ_μ is mapped onto a fixed (stable) point S_μ of a dynamical system: ξ_μ ⇔ S_μ.
• Retrieval: use the asynchronous dynamical procedure, i.e., update the state of a neuron selected from those that want to change, picked randomly and one at a time.
• This procedure repeats until there are no further state changes to report.
• The network requires time to settle into an equilibrium state: the Hopfield network is a relaxation network with a local learning rule.
• Because it can retrieve a stored pattern even when the input pattern is incomplete or contains errors, it acts as a nonlinear associative memory, or content-addressable memory (CAM).

Structure of Hopfield Network
• Consists of N neurons, where N is the dimension of each fundamental memory.
• The output of each neuron is fed back to the inputs of all other neurons via a unit-delay element.
• No self-feedback (w_ii = 0).
• The weight matrix W is symmetric, i.e., W^T = W and w_ji = w_ij: the influence of neuron j on neuron i equals the influence of neuron i on neuron j.
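The energy-descent idea behind retrieval can be sketched in code: with symmetric, zero-diagonal weights, asynchronous sign updates never increase the network energy E(x) = −½ Σ_{i≠j} w_ji x_j x_i, which is why each memory sits at the bottom of a basin of attraction. The weight values, initial state, and update schedule below are made-up illustrations, not values from the chapter.

```python
def energy(W, x):
    """Hopfield energy for state x (thresholds taken as zero)."""
    n = len(x)
    return -0.5 * sum(W[j][i] * x[j] * x[i]
                      for j in range(n) for i in range(n) if i != j)

def async_step(W, x, j):
    """Update neuron j from its local field; keep the old state if the field is 0."""
    v = sum(W[j][i] * x[i] for i in range(len(x)) if i != j)
    if v > 0:
        x[j] = 1
    elif v < 0:
        x[j] = -1
    return x

# Toy symmetric weights with zero diagonal (an assumption for the demo).
W = [[0, 1, -1],
     [1, 0, 1],
     [-1, 1, 0]]
x = [1, -1, 1]          # made-up initial state

e = energy(W, x)
for j in [0, 1, 2, 0, 1, 2]:   # two sweeps, one neuron at a time
    async_step(W, x, j)
    e_new = energy(W, x)
    assert e_new <= e          # energy is non-increasing
    e = e_new
```

After the sweeps the state no longer changes, i.e., it has reached a fixed (stable) point of the dynamics.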
Operations
• State of neuron i:
    x_i = +1 if v_i > 0
    x_i = −1 if v_i < 0
  i.e., x_i = sgn[v_i], the signum function.
• If v_i = 0, neuron i remains in its previous state.
• The induced local field of neuron j is
    v_j = Σ_{i=1, i≠j}^{N} w_ji x_i − θ_j,   j = 1, 2, …, N,
  where θ_j is the threshold of neuron j.
• Two phases: the storage phase and the retrieval phase.

Storage Phase
• To store a set of N-dimensional vectors (the fundamental memories) {ξ_μ | μ = 1, 2, …, p}.
• Use the outer-product rule of storage (a generalization of Hebb's postulate of learning):
    w_ji = (1/N) Σ_{μ=1}^{p} ξ_μ,j ξ_μ,i   for j ≠ i,
    w_ji = 0                               for j = i (no self-feedback),
  where ξ_μ,j is the j-th element of the input vector ξ_μ.
• Let W be an N × N synaptic weight matrix. …
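The two phases above can be sketched together: the outer-product rule builds W from the fundamental memories, and asynchronous updates then repair a corrupted probe. This is a minimal illustration, not the chapter's own code; the function names (`store`, `retrieve`), the pattern values, and the fixed sweep order are assumptions.

```python
def store(patterns):
    """Outer-product (Hebbian) rule: w_ji = (1/N) * sum_mu xi[mu][j]*xi[mu][i], w_jj = 0."""
    N = len(patterns[0])
    W = [[0.0] * N for _ in range(N)]
    for j in range(N):
        for i in range(N):
            if i != j:
                W[j][i] = sum(xi[j] * xi[i] for xi in patterns) / N
    return W

def retrieve(W, probe, max_sweeps=10):
    """Asynchronous updates, one neuron at a time, until no state changes (thresholds = 0)."""
    N = len(probe)
    x = list(probe)
    for _ in range(max_sweeps):
        changed = False
        for j in range(N):
            v = sum(W[j][i] * x[i] for i in range(N) if i != j)
            new = 1 if v > 0 else (-1 if v < 0 else x[j])  # keep state when v == 0
            if new != x[j]:
                x[j], changed = new, True
        if not changed:        # no neuron wants to change: a fixed point
            break
    return x

memory = [1, 1, -1, -1, 1, -1]        # one made-up fundamental memory (N = 6)
W = store([memory])
probe = [1, -1, -1, -1, 1, -1]        # the memory with one flipped bit
print(retrieve(W, probe) == memory)   # prints True
```

Because the probe differs from the stored memory in only one element, it lies inside that memory's basin of attraction, and the asynchronous dynamics pull it back to the fixed point.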
This note was uploaded on 04/13/2011 for the course EE 4210 taught by Professor Wong during the Spring '10 term at City University of Hong Kong.