EE4210 Tutorial 3
Train a Perceptron to perform the logical NOR function
Besides using a logic gate, the logical NOR function with bipolar inputs and output defined by Table 1 can also be implemented by training a perceptron with threshold (as shown in Fig.
Chapter 7
Fuzzy Sets and Operations

Fuzziness versus Probability
Fuzziness: an alternative to randomness for describing uncertainty.
Fuzzy Theory: all things admit degrees (not clear-cut), but admit them deterministically.
Fuzziness & Randomness concept
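The degrees of membership that fuzzy theory works with combine through simple pointwise operations. A minimal sketch of the classical (Zadeh) operators; the sets `A` and `B` and their membership values are hypothetical:

```python
# Classical (Zadeh) fuzzy set operations on membership degrees in [0, 1]:
#   complement:          1 - mu_A(x)
#   intersection (AND):  min(mu_A(x), mu_B(x))
#   union (OR):          max(mu_A(x), mu_B(x))

A = {"cold": 0.8, "warm": 0.3, "hot": 0.0}   # hypothetical fuzzy memberships
B = {"cold": 0.1, "warm": 0.6, "hot": 0.9}

complement_A = {x: 1.0 - mu for x, mu in A.items()}
intersection = {x: min(A[x], B[x]) for x in A}
union = {x: max(A[x], B[x]) for x in A}

# Unlike a probability, a membership degree is deterministic: an element can
# belong to both A and its complement to nonzero degrees at the same time.
assert abs(complement_A["warm"] - 0.7) < 1e-9
assert intersection["warm"] == 0.3 and union["warm"] == 0.6
```

Note that "warm" belongs to A with degree 0.3 and to the complement of A with degree 0.7 simultaneously, which is exactly the deterministic, not-clear-cut character described above.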
Chapter 4
Multilayer Perceptron

Multilayer Perceptron
A generalization of the single-layer perceptron to enhance its computational power.
Training method: error backpropagation algorithm, which is based on the error-correction learning rule.
Requirement:
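The error-correction idea behind backpropagation can be sketched on a tiny network. The one-hidden-layer architecture, sigmoid activations, input pattern, target, and learning rate below are all assumptions chosen for illustration, not values from the course notes:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 2))   # input -> hidden weights
W2 = rng.normal(size=(1, 2))   # hidden -> output weights
x = np.array([1.0, -1.0])      # input pattern (hypothetical)
d = np.array([1.0])            # desired output (hypothetical)
eta = 0.5                      # learning rate (assumed)

err_init = float(abs(d - sigmoid(W2 @ sigmoid(W1 @ x)))[0])

for _ in range(200):
    # Forward pass
    h = sigmoid(W1 @ x)                    # hidden activations
    y = sigmoid(W2 @ h)                    # network output

    # Backward pass: local gradients (deltas) from the error-correction rule
    e = d - y                              # error signal
    delta_out = e * y * (1 - y)                    # output-layer delta
    delta_hid = (W2.T @ delta_out) * h * (1 - h)   # backpropagated delta

    # All weights are updated at the same time
    W2 += eta * np.outer(delta_out, h)
    W1 += eta * np.outer(delta_hid, x)

err_final = float(abs(d - sigmoid(W2 @ sigmoid(W1 @ x)))[0])
assert err_final < err_init    # training reduced the output error
```

The key step is that each layer's delta is computed from the deltas of the layer above it, which is what makes the algorithm "backpropagation" of the error signal.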
Chapter 6
Self-organizing Map

Introduction
Self-organizing feature map: a special class of ANN based on competitive learning, a kind of unsupervised learning very similar to the human brain's learning process.
The output neurons compete among themselves; only one output neuron (the winner) is activated at any one time.
Chapter 3
Perceptron

Perceptron
The simplest form of an ANN, used for the classification of patterns that are linearly separable.
Simplest version: a single neuron with adjustable synaptic weights and threshold.
After training, a decision surface in the form of a hyperplane separates the two classes.
Chapter 2
Learning

Basic Concepts
Learning is a process by which the free parameters of a neural network are adapted through a continuing process of stimulation by the environment in which the network is embedded.
In an ANN, learning amounts to the modification of its synaptic weights.
EE4210
Neural Networks and
Fuzzy Systems
Dr. K.W. Wong
Department of Electronic Engineering
City University of Hong Kong
[email protected]
FYW6320
Ext: 9409
Course Objectives
1. Introduce the fundamental theories of artificial neural networks (ANN)
2.
Network                Architecture                                        Learning Type
Perceptron             Feedforward, single layer of neurons                Supervised
Multilayer Perceptron  Feedforward                                         Supervised
Hopfield Network       Recurrent
Self-Organizing Map    Neurons located in lattice with lateral connection  Unsupervised (Competitive Learning, P.6.19)
EE4210
Tutorial 1
1. General Concepts on Neural Networks and Fuzzy Systems
(a) Compare conventional digital computers and artificial neural networks on the following
aspects: (i) processing unit, (ii) memory storage, and (iii) mode of processing.
(b) What
EE4210 Tutorial 5
Hopfield Network
A Hopfield network is used to store the following three fundamental memories:
ξ1 = [ 1, 1, 1, 1]^T
ξ2 = [ 1, 1, 1, 1]^T
ξ3 = [ 1, 1, 1, 1]^T
1. Compute the synaptic weight matrix W of the network.
2. For asynchronous updating,
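Step 1 follows the outer-product (Hebbian) storage prescription, W = (1/N) Σ ξ_μ ξ_μ^T − (p/N) I. A minimal sketch; the pattern signs below are placeholders, not the tutorial's actual memories:

```python
import numpy as np

# Hypothetical bipolar fundamental memories (placeholder signs; the actual
# patterns come from the tutorial statement).
xis = [np.array([1, -1, 1, -1]),
       np.array([-1, 1, -1, 1]),
       np.array([1, 1, -1, -1])]

N = len(xis[0])          # number of neurons
p = len(xis)             # number of stored patterns

# Outer-product (Hebbian) rule: W = (1/N) * sum(xi xi^T) - (p/N) * I,
# where subtracting (p/N) * I zeroes the self-feedback terms.
W = sum(np.outer(xi, xi) for xi in xis) / N - (p / N) * np.eye(N)

assert np.allclose(W, W.T)          # symmetric synaptic connections
assert np.allclose(np.diag(W), 0)   # no self-feedback
```

The two assertions check the defining structural properties of a Hopfield weight matrix: symmetry and a zero diagonal.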
EE4210 Tutorial 4
Backpropagation Training in Multilayer Perceptron
1. Suppose that the multilayer perceptron shown in Fig. 1 is trained by the backpropagation
algorithm and all the weights are updated at the same time. The inputs are X1 = 1 and X2 = 1
wh
EE4210
Solution to Tutorial 3
A Perceptron for implementing the 4-point logical OR function

(a) y = φ(v) = +1 if v > 0, −1 if v ≤ 0

Use the conventional perceptron training algorithm:
Correctly classified => Δwi = 0, for i = 1, 2
Wrongly classified => Δwi = η d xi =
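This training rule can be sketched as a loop over the four bipolar input patterns. The learning rate η = 1 and the zero initial weights are assumptions for illustration; the tutorial's Fig. 1 may use different values:

```python
import numpy as np

# Bipolar OR truth table: inputs (x1, x2) -> desired output d
data = [((-1, -1), -1), ((-1, 1), 1), ((1, -1), 1), ((1, 1), 1)]

eta = 1.0        # learning rate (assumed)
w = np.zeros(2)  # adjustable synaptic weights
b = 0.0          # bias (threshold)

def predict(x):
    v = np.dot(w, x) + b
    return 1 if v > 0 else -1   # y = phi(v): +1 if v > 0, else -1

# Conventional perceptron training: weights change only on misclassification
for epoch in range(20):
    errors = 0
    for x, d in data:
        x = np.array(x, dtype=float)
        if predict(x) != d:          # wrongly classified => dw = eta * d * x
            w += eta * d * x
            b += eta * d
            errors += 1
    if errors == 0:                  # converged: all four points classified
        break

assert all(predict(np.array(x, dtype=float)) == d for x, d in data)
```

Because bipolar OR is linearly separable, the perceptron convergence theorem guarantees this loop terminates with all four points on the correct side of the decision surface.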
EE4210 Tutorial 3
Train a Perceptron to perform the logical OR function
Besides using a logic gate, the logical OR function with bipolar inputs and output defined by
Table 1 can also be implemented by training a perceptron with threshold (as shown in Fig.
EE4210 Tutorial 2
1. Hebbian Learning
A constant input signal of x=1.1 is applied repeatedly to a synaptic connection whose initial
weight w(0)=1. Assume that the neuronal activation function is linear, i.e., φ(v) = v.
Calculate the synaptic weight w(n) a
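With the linear activation y = w x, the Hebbian update w(n+1) = w(n) + η y(n) x reduces to w(n+1) = w(n)(1 + η x²), so the weight grows geometrically. A minimal sketch; the learning rate η = 0.1 is an assumption, as this excerpt does not give it:

```python
# Hebbian learning with a constant input and linear activation phi(v) = v.
# Update: w(n+1) = w(n) + eta * y(n) * x, with y(n) = w(n) * x,
# which gives the closed form w(n) = w(0) * (1 + eta * x**2)**n.
x = 1.1      # constant input signal (from the tutorial)
w = 1.0      # initial weight w(0)
eta = 0.1    # learning rate (assumed; not given in this excerpt)

history = [w]
for n in range(5):
    y = w * x                # linear activation: y = phi(v) = w * x
    w = w + eta * y * x      # Hebbian weight update
    history.append(w)

# Check against the closed form (w(0) = 1)
assert abs(history[5] - (1 + eta * x**2) ** 5) < 1e-9
```

The geometric growth illustrates the well-known instability of pure Hebbian learning: without a forgetting or normalization term, the weight diverges as n increases.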
EE4210
Solution to Tutorial 1
1. General Concepts on Neural Networks and Fuzzy Systems
(a) Processing unit
Conventional digital computers: one or only a few complicated central processing units (CPUs)
Artificial neural networks: a vast amount of simple processing units
EE4210 Tutorial 6
Learning in a 1-D Self-Organizing Map
The 1-D self-organizing map (SOM) shown in Fig. 1 has 4 neurons and 3 inputs.
xi = i th input signal
wji = synaptic weight from input i to neuron j
vj = net activity level of neuron j
yj = output of neuron j
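With these definitions, one competitive learning step can be sketched as follows. The Gaussian neighborhood function, the learning rate, and the random initialization are assumptions for illustration, not values from the tutorial:

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_neurons = 3, 4
W = rng.random((n_neurons, n_inputs))   # W[j, i] = wji, weight from input i to neuron j
x = rng.random(n_inputs)                # input signal vector (x1, x2, x3)

# Competition: the winner is the neuron whose weight vector is closest to x.
winner = int(np.argmin(np.linalg.norm(W - x, axis=1)))
d_before = np.linalg.norm(W[winner] - x)

# Cooperation + adaptation: each neuron moves toward x, weighted by a 1-D
# Gaussian neighborhood h over the neuron index (assumed form).
eta, sigma = 0.5, 1.0                   # learning rate and width (assumed)
for j in range(n_neurons):
    h = np.exp(-((j - winner) ** 2) / (2 * sigma ** 2))
    W[j] += eta * h * (x - W[j])

assert np.linalg.norm(W[winner] - x) < d_before   # winner moved toward x
```

The neighborhood term is what distinguishes a SOM from plain winner-take-all learning: neurons adjacent to the winner in the 1-D lattice also move toward the input, which is how the topological ordering emerges.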
EE4210 Tutorial 7 Learning in a 1-D Self-Organizing Map
Consider the 1-D self-organizing map (SOM) shown in Fig. 1.
[Fig. 1: a 1-D SOM with input signals x1 to x4 connected through synaptic weights wji to output neurons 1 and 2, with outputs y1 and y2]
xi = i th input signal
wji = synaptic weight from input i to neuron j
vj = net activity level of neuron j
EE4210 Tutorial 9
Fuzzy Associative Memory (FAM)
Consider the control of an inverted pendulum with two fuzzy state variables and one fuzzy control variable.
State variable: the angle that the pendulum shaft makes with the vertical axis.
State variable:
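A minimal sketch of how one FAM rule fires under standard min-max (Mamdani) inference. Everything concrete here is an assumption for illustration: the second state variable is taken to be the angular velocity, and the triangular membership functions and crisp measurements are hypothetical:

```python
# Min-max (Mamdani) inference for one FAM rule, e.g.
#   IF angle is Positive-Small AND velocity is Zero THEN force is Negative-Small
# All membership functions and inputs below are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

angle, velocity = 5.0, 1.0                  # crisp measurements (hypothetical)

# Degrees of membership of each antecedent
mu_angle = tri(angle, 0.0, 10.0, 20.0)      # "Positive-Small" angle
mu_vel = tri(velocity, -5.0, 0.0, 5.0)      # "Zero" velocity

# AND of the antecedents -> rule firing strength (min)
firing = min(mu_angle, mu_vel)

# The consequent fuzzy set is clipped at the firing strength; combining
# several rules would take the max over their clipped outputs.
def out_force(u):
    return min(firing, tri(u, -20.0, -10.0, 0.0))   # "Negative-Small" force

assert abs(firing - 0.5) < 1e-9
assert out_force(-10.0) == firing   # peak of the clipped consequent
```

Defuzzifying the combined output set (for instance by the centroid method) would then yield the crisp control force applied to the cart.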
Chapter 5
Hopfield Network
Recurrent Network
Inspired by different ideas from statistical physics.
Characteristics: abundant use of feedback, symmetric synaptic connections, nonlinear computing units.
Examples: Hopfield Network / Boltzmann Machine / Mea