EE4210 Tutorial 2

1. Hebbian Learning

A constant input signal of x = 1.2 is applied repeatedly to a synaptic connection whose initial weight is w(0) = 1. Assume that the neuronal activation function is linear, i.e., φ(v) = v. Calculate the synaptic weight w(n) at time n = 1, 2, 3, …, k, … using the following rules:

(a) Simple form of Hebb's rule (activity product rule), assuming that the learning-rate parameter η = 0.75.

(b) Modified form of Hebb's rule (generalized activity product rule) for the following 2 cases:
(i) η = 0.75, forgetting term parameter a = 2
(ii) η = 0.75, a = 0.75

(c) Differential Hebbian learning rule Type I, i.e., only signal velocity is counted, assuming that η = 0.75.
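The recursions can be checked numerically. Below is a minimal Python sketch, assuming the standard forms of the rules (the sheet itself does not restate them): the activity product rule Δw(n) = η y(n) x, the generalized rule with a forgetting term Δw(n) = η y(n) x − a y(n) w(n), and, for differential Hebbian learning of Type I, Δw(n) = η Δx(n) Δy(n). The function names are illustrative only.

    x = 1.2      # constant input signal
    eta = 0.75   # learning-rate parameter

    def simple_hebb(w0=1.0, steps=5):
        # Activity product rule: w(n+1) = w(n) + eta * y(n) * x
        w, ws = w0, []
        for _ in range(steps):
            y = w * x            # linear activation: y = phi(v) = v = w * x
            w = w + eta * y * x
            ws.append(w)
        return ws

    def generalized_hebb(a, w0=1.0, steps=5):
        # Generalized rule: Hebbian term minus forgetting term a * y(n) * w(n)
        w, ws = w0, []
        for _ in range(steps):
            y = w * x
            w = w + eta * y * x - a * y * w
            ws.append(w)
        return ws

    print(simple_hebb())           # (a): w(n) = (1 + eta * x**2)**n = 2.08**n
    print(generalized_hebb(2.0))   # (b)(i):  a = 2
    print(generalized_hebb(0.75))  # (b)(ii): a = 0.75
    # (c): the input is constant, so dx = 0, hence dy = 0 and w(n) = w(0) = 1.

For part (a) a closed form follows directly: with y(n) = w(n)·x, the update gives w(n+1) = w(n)(1 + η x²) = 2.08 w(n), so w(n) = 2.08ⁿ, which grows without bound; for part (c) the constant input means the signal velocity Δx(n) is zero at every step, so the weight never changes.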

2. Competitive Learning

The neural network shown in Fig. 1 demonstrates the simplest form of competitive learning. Lateral inhibition is realized by the lateral connections from the other neurons in the same layer.

x_i = i-th input signal
w_ji = synaptic weight from input i to neuron j
v_j = net activity level of neuron j
y_j = φ(v_j)

[Fig. 1: a single layer of three neurons with outputs y1, y2, y3, fully connected to the four inputs x1, …, x4 through weights w_ji, with lateral connections among the neurons.]

Initial output y = [y1 y2 y3] = [0 0 0]

Initial weight matrix W (3 neurons × 4 inputs): the entries recoverable from this preview are 0.75, 0.5, 0.5, 0.25, 0.25, 0.25, 0.25, 0.25, but their arrangement in the matrix is garbled.

y_j = φ(v_j) = 1 for the winning neuron, 0 for the other neurons

η = 0.6

Find the synaptic weights after two iterations for the following 2 cases:
(a) Initial input x = [x1 x2 x3 x4] = [1 0 0 0]
(b) Initial input x = [x1 x2 x3 x4] = [0.4 0.1 0.4 0.1]
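A minimal Python sketch of the two-iteration experiment, assuming the standard competitive learning update Δw_ji = η (x_i − w_ji) applied to the winning neuron only (the usual rule for this kind of network; the sheet does not restate it). The matrix W below is a stand-in with each row summing to 1, because the actual entries cannot be placed from this preview; substitute the matrix from the original sheet.

    import numpy as np

    eta = 0.6  # learning-rate parameter given in the question

    # Stand-in initial weights (3 neurons x 4 inputs, rows summing to 1).
    # Replace with the tutorial's actual matrix.
    W = np.array([[0.25, 0.25, 0.25, 0.25],
                  [0.75, 0.25, 0.00, 0.00],
                  [0.50, 0.50, 0.00, 0.00]])

    def competitive_step(W, x):
        v = W @ x                    # net activity: v_j = sum_i w_ji * x_i
        winner = int(np.argmax(v))   # winner-take-all: y_j = 1 only for max v_j
        W = W.copy()
        W[winner] += eta * (x - W[winner])  # move winner's weights toward x
        return W, winner

    for x in (np.array([1.0, 0.0, 0.0, 0.0]),   # case (a)
              np.array([0.4, 0.1, 0.4, 0.1])):  # case (b)
        Wn = W
        for it in range(2):                     # two iterations
            Wn, j = competitive_step(Wn, x)
            print(f"x = {x}: iteration {it + 1}, neuron {j + 1} wins")
        print(Wn)

Note that both given inputs sum to 1, and under Δw_ji = η (x_i − w_ji) the row sum of the winning neuron is unchanged whenever it already equals 1, so weight vectors normalized to unit sum stay normalized; this is the usual motivation for that form of the rule.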