EE4210 Tutorial 4: Backpropagation Training in Multilayer Perceptron

1. Suppose that the multilayer perceptron shown in Fig. 1 is trained by the backpropagation training algorithm and all the weights are updated at the same time. The inputs are X1 = 0 and X2 = 1, while the target output d is 0.9. The learning-rate parameter η is 0.7, and the logistic function

    φ(v) = 1 / (1 + exp(-v))

is chosen as the neuronal activation function.

[Fig. 1: A multilayer perceptron with inputs X1 and X2, hidden neurons H1 and H2, and output neuron Y1. The figure itself is not reproduced here; the parameter values it shows are listed after Q2 below.]

(a) In the feedforward phase, find the actual network output y_Y1 and the error.

(b) In the error backpropagation phase, do the following computations:

(i) Calculate the local gradient δ_Y1 of the output neuron Y1.

(ii) Find the required change Δw for all the weights and the threshold connected to Y1.

(iii) Calculate the local gradients δ_H1 and δ_H2 of the two hidden neurons.

(iv) Find the required change Δw for all the weights and thresholds connected to the two hidden neurons.

(v) Update all the weights and thresholds in the multilayer perceptron.

2. Use the new weights and thresholds obtained in Q1 to find the new actual network output y_Y1 for the same inputs X1 = 0 and X2 = 1. Is it closer to the target value 0.9?

[Fig. 1 values recovered from the page: nodes X1, H1, X2, H2, Y1; parameters -0.5, -0.4, 0.1, 0.6, -0.7, 0.2, -0.8, 0.3, -0.6. The three additional -1 entries appear to be the fixed -1 inputs feeding each neuron's threshold weight.]
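The arithmetic in Q1 is mechanical but easy to get wrong by hand. Below is a minimal Python sketch of one full backpropagation step for this 2-2-1 network, using the standard rules Δw = η·δ·(input signal), with δ_Y1 = e·y(1-y) at the output and δ_H = y(1-y)·δ_Y1·w(H→Y1) at each hidden neuron. The nine parameter values are taken from the Fig. 1 residue above, but their pairing with specific connections is an assumption (marked in the comments), as is the use of -1 bias inputs and the helper name forward; verify both against the actual figure before trusting the numbers.

    import math

    def logistic(v):
        # Neuronal activation: phi(v) = 1 / (1 + exp(-v))
        return 1.0 / (1.0 + math.exp(-v))

    # Given in the problem statement
    x1, x2 = 0.0, 1.0   # inputs X1, X2
    d = 0.9             # target output
    eta = 0.7           # learning-rate parameter
    BIAS = -1.0         # fixed -1 input feeding each threshold weight

    # Nine parameters read off Fig. 1; this pairing is ASSUMED, not
    # recoverable from the text -- check it against the figure.
    w = {
        ("x1", "h1"): -0.5, ("x2", "h1"): -0.4, ("bias", "h1"): 0.1,
        ("x1", "h2"):  0.6, ("x2", "h2"): -0.7, ("bias", "h2"): 0.2,
        ("h1", "y1"): -0.8, ("h2", "y1"):  0.3, ("bias", "y1"): -0.6,
    }

    def forward(w):
        # (a) Feedforward phase: induced local fields, then activations
        y_h1 = logistic(w[("x1", "h1")] * x1 + w[("x2", "h1")] * x2
                        + w[("bias", "h1")] * BIAS)
        y_h2 = logistic(w[("x1", "h2")] * x1 + w[("x2", "h2")] * x2
                        + w[("bias", "h2")] * BIAS)
        y_y1 = logistic(w[("h1", "y1")] * y_h1 + w[("h2", "y1")] * y_h2
                        + w[("bias", "y1")] * BIAS)
        return y_h1, y_h2, y_y1

    y_h1, y_h2, y_y1 = forward(w)
    e = d - y_y1                       # (a) the error

    # (b)(i) Output local gradient: delta = e * phi'(v) = e * y * (1 - y)
    delta_y1 = e * y_y1 * (1.0 - y_y1)

    # (b)(ii) Delta-w for the weights and threshold connected to Y1
    dw = {
        ("h1", "y1"):   eta * delta_y1 * y_h1,
        ("h2", "y1"):   eta * delta_y1 * y_h2,
        ("bias", "y1"): eta * delta_y1 * BIAS,
    }

    # (b)(iii) Hidden local gradients; the OLD output-layer weights are
    # used here because all weights are updated at the same time.
    delta_h1 = y_h1 * (1.0 - y_h1) * delta_y1 * w[("h1", "y1")]
    delta_h2 = y_h2 * (1.0 - y_h2) * delta_y1 * w[("h2", "y1")]

    # (b)(iv) Delta-w for the connections into the hidden neurons
    for src, x in (("x1", x1), ("x2", x2), ("bias", BIAS)):
        dw[(src, "h1")] = eta * delta_h1 * x
        dw[(src, "h2")] = eta * delta_h2 * x

    # (b)(v) Update every weight and threshold simultaneously
    new_w = {k: w[k] + dw[k] for k in w}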
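For Q2, the same forward pass is simply repeated with the updated parameters. Because one small gradient-descent step reduces the error locally, the new output should move toward the target 0.9; the continuation below (reusing the sketch above) makes the comparison explicit:

    # Q2: recompute the network output with the updated weights
    _, _, y_new = forward(new_w)
    print(f"output before update: {y_y1:.4f}")
    print(f"output after update:  {y_new:.4f}  (target {d})")
    print("closer to target:", abs(d - y_new) < abs(d - y_y1))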