EE4210 Tutorial 3
Train a Perceptron to perform the logical NOR function
Besides using a logic gate, the logical NOR function with bipolar inputs and output defined by Table 1 can also be implemented by a perceptron.
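A minimal sketch of training such a perceptron on the bipolar NOR truth table, assuming the usual error-correction rule; the learning rate and initial weights below are illustrative choices, not values from the tutorial:

```python
def sign(v):
    # Bipolar hard limiter: +1 if v > 0, else -1
    return 1 if v > 0 else -1

# Bipolar truth table for NOR: output is +1 only when both inputs are -1
samples = [((-1, -1), 1), ((-1, 1), -1), ((1, -1), -1), ((1, 1), -1)]

# Weights [bias, w1, w2]; inputs are augmented with a constant +1
w = [0.0, 0.0, 0.0]
eta = 0.5

for epoch in range(20):
    errors = 0
    for (x1, x2), d in samples:
        x = (1, x1, x2)                    # augmented input
        y = sign(sum(wi * xi for wi, xi in zip(w, x)))
        if y != d:                         # update only on misclassification
            w = [wi + eta * (d - y) * xi for wi, xi in zip(w, x)]
            errors += 1
    if errors == 0:                        # converged: every point classified
        break

predictions = [sign(w[0] + w[1] * x1 + w[2] * x2) for (x1, x2), _ in samples]
```

Since NOR with bipolar inputs is linearly separable, the perceptron convergence theorem guarantees this loop terminates with all four points correct.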
Chapter 7 Fuzzy Sets and Operations
Fuzziness versus Probability
Fuzziness: an alternative to randomness for describing uncertainty. Fuzzy theory: all things admit degrees (not clear-cut), but admit
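The "degrees" idea can be made concrete with the standard (Zadeh) operations on discrete membership functions; the sets A and B below are made-up examples, not ones from the chapter:

```python
# Membership grades in [0, 1] over a small discrete universe
A = {'cold': 0.9, 'warm': 0.3, 'hot': 0.0}
B = {'cold': 0.1, 'warm': 0.6, 'hot': 1.0}

# Complement: mu_notA(x) = 1 - mu_A(x)
not_A = {x: 1.0 - m for x, m in A.items()}

# Union: mu_AuB(x) = max(mu_A(x), mu_B(x))
A_union_B = {x: max(A[x], B[x]) for x in A}

# Intersection: mu_AnB(x) = min(mu_A(x), mu_B(x))
A_inter_B = {x: min(A[x], B[x]) for x in A}
```

Note that, unlike crisp sets, the intersection of A with its complement is generally non-empty: elements can belong to both to some degree.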
Chapter 4
Multilayer Perceptron
A generalization of the single-layer perceptron to enhance its computational power. Training method: the error backpropagation algorithm, which is based on the error-correction learning rule.
Chapter 6
Self-organizing Map
Introduction
Self-organizing feature map: a special class of ANN based on competitive learning, a kind of unsupervised learning very similar to the human brain's learning process.
Chapter 5
Hopfield Network
Recurrent Network
Inspired by different ideas from statistical
physics.
Characteristics:
abundant use of feedback,
symmetric synaptic connections,
nonlinear computing units
Chapter 3
Perceptron
The simplest form of an ANN, used for the classification of patterns that are linearly separable. Simplest version: a single neuron with adjustable synaptic weights and bias.
Chapter 2
Learning
Basic Concepts
Learning is a process by which the free
parameters of a neural network are adapted
through a continuing process of stimulation
by the environment in which the network is embedded.
EE4210
Neural Networks and
Fuzzy Systems
Dr. K.W. Wong
Department of Electronic Engineering
City University of Hong Kong
[email protected]
FYW6320
Ext: 9409
Course Objectives
1. Introduce the f
Network               | Architecture
Perceptron            | Feedforward, single layer of neurons
Multilayer Perceptron | Feedforward
Hopfield Network      | Recurrent
Self-Organizing Map   | Neurons located in lattice with lateral connections
EE4210
Tutorial 1
1. General Concepts on Neural Networks and Fuzzy Systems
(a) Compare conventional digital computers and artificial neural networks on the following
aspects: (i) processing unit, (ii)
EE4210
Solution to Tutorial 5
Hopfield Network
1. W = (1/N) Σ_{μ=1}^{p} ξ_μ ξ_μᵀ − (p/N) I, with p = 3 and N = 4, giving a symmetric 4 × 4 weight matrix with zero diagonal and off-diagonal entries of magnitude 1/4 and 3/4.
2. Condition for stability: ξ_i = sgn(W ξ_i) for i = 1, 2, 3; each stored memory must be a fixed point of the update. Checking i = 1, 2, 3 in turn verifies this for each memory.
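The weight computation and the stability check can be sketched together in code; the three bipolar memories below are hypothetical stand-ins used only to illustrate the procedure, not the tutorial's actual patterns:

```python
# Hypothetical fundamental memories (bipolar, N = 4, p = 3)
xis = [
    [1, 1, 1, 1],
    [1, -1, -1, 1],
    [-1, 1, -1, 1],
]
N = len(xis[0])          # 4 neurons
p = len(xis)             # 3 memories

# Outer-product storage rule: W = (1/N) sum_p xi_p xi_p^T - (p/N) I
W = [[0.0] * N for _ in range(N)]
for xi in xis:
    for i in range(N):
        for j in range(N):
            W[i][j] += xi[i] * xi[j] / N
for i in range(N):
    W[i][i] -= p / N     # subtracting (p/N) I zeroes the diagonal

def sgn(v):
    return 1 if v > 0 else -1

def update(x):
    # One synchronous update: x -> sgn(W x)
    return [sgn(sum(W[i][j] * x[j] for j in range(N))) for i in range(N)]

# A memory xi is stable when sgn(W xi) = xi
stable = [update(xi) == xi for xi in xis]
```

With these three patterns every memory is a fixed point, and the matrix is symmetric with zero self-connections, as the Hopfield model requires.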
EE4210 Tutorial 5
Hopfield Network
A Hopfield network is used to store three fundamental memories, ξ_1, ξ_2 and ξ_3, each a bipolar vector in {−1, +1}^4.
1. Compute the synaptic weight matrix W of the network.
EE4210 Tutorial 4
Backpropagation Training in Multilayer Perceptron
1. Suppose that the multilayer perceptron shown in Fig. 1 is trained by the backpropagation
algorithm and all the weights are updated
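A single backpropagation step can be sketched as follows; the 2-2-1 architecture, weights, learning rate and training pair are illustrative stand-ins, since Fig. 1 is not reproduced here:

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

eta = 0.5
x = [1.0, 0.0]                    # input pattern
d = 1.0                           # desired output

W1 = [[0.1, -0.2], [0.4, 0.2]]    # W1[j][i]: input i -> hidden neuron j
W2 = [0.3, -0.1]                  # hidden neuron j -> output neuron

# Forward pass
h = [sigmoid(sum(W1[j][i] * x[i] for i in range(2))) for j in range(2)]
y = sigmoid(sum(W2[j] * h[j] for j in range(2)))
err_before = 0.5 * (d - y) ** 2

# Backward pass: local gradients (deltas) for the logistic nonlinearity
delta_o = (d - y) * y * (1 - y)                          # output neuron
delta_h = [h[j] * (1 - h[j]) * delta_o * W2[j] for j in range(2)]

# Gradient-descent updates: w <- w + eta * delta * (input to that weight)
W2 = [W2[j] + eta * delta_o * h[j] for j in range(2)]
W1 = [[W1[j][i] + eta * delta_h[j] * x[i] for i in range(2)] for j in range(2)]

# The same pattern now produces a smaller squared error
h = [sigmoid(sum(W1[j][i] * x[i] for i in range(2))) for j in range(2)]
y = sigmoid(sum(W2[j] * h[j] for j in range(2)))
err_after = 0.5 * (d - y) ** 2
```

Note the hidden-layer deltas are computed before W2 is overwritten, since backpropagation propagates the error through the old output weights.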
EE4210
Solution to Tutorial 3
A Perceptron for implementing the 4-point logical OR function
(a) y = φ(v) = +1 if v > 0, −1 if v ≤ 0
Use the conventional perceptron training algorithm:
correctly classified => no change to the weights
EE4210 Tutorial 3
Train a Perceptron to perform the logical OR function
Besides using a logic gate, the logical OR function with bipolar inputs and output defined by
Table 1 can also be implemented by a perceptron.
EE4210 Tutorial 2
1. Hebbian Learning
A constant input signal of x=1.1 is applied repeatedly to a synaptic connection whose initial
weight w(0)=1. Assume that the neuronal activation function is linear.
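The repeated Hebbian updates can be sketched as below; the learning rate eta is an assumed value, since the excerpt does not show it:

```python
x = 1.1          # constant input signal
w = 1.0          # initial weight w(0)
eta = 0.1        # assumed learning rate (not given in this excerpt)

history = [w]
for n in range(5):
    y = w * x                    # linear neuron: output = w * x
    w = w + eta * y * x          # Hebb's rule: w(n+1) = w(n) + eta*y(n)*x(n)
    history.append(w)

# Each step multiplies w by the same factor (1 + eta * x**2), so the
# weight grows geometrically without bound -- the well-known
# instability of plain Hebbian learning.
growth = 1 + eta * x ** 2        # = 1.121 per step here
```

This unbounded growth is why variants such as Oja's rule add a normalizing term.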
EE4210
Solution to Tutorial 1
1. General Concepts on Neural Networks and Fuzzy Systems
(a)
Conventional digital computers
Artificial neural networks
Processing unit
One or only a few complicated central processing units
EE4210 Tutorial 6
Learning in a 1-D Self-Organizing Map
The 1-D self-organizing map (SOM) shown in Fig. 1 has 4 neurons and 3 inputs.
xi = the i-th input signal
wji = synaptic weight from input i to neuron j
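One learning step of such a map might look like this; the weight values, input vector, learning rate and neighbourhood radius are illustrative assumptions, not the figures from Fig. 1:

```python
# w[j][i]: synaptic weight from input i to neuron j (4 neurons, 3 inputs)
w = [
    [0.2, 0.6, 0.5],
    [0.8, 0.4, 0.7],
    [0.3, 0.9, 0.1],
    [0.5, 0.5, 0.5],
]
x = [0.1, 0.7, 0.4]      # input vector
eta = 0.5                 # learning rate
radius = 1                # rectangular neighbourhood around the winner

def dist2(a, b):
    # Squared Euclidean distance (sufficient for picking the winner)
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

# Competition: the winner is the neuron whose weight vector is closest to x
winner = min(range(len(w)), key=lambda j: dist2(w[j], x))

# Cooperation + adaptation: the winner and its lattice neighbours move
# toward x; neurons outside the neighbourhood are left unchanged
for j in range(len(w)):
    if abs(j - winner) <= radius:
        w[j] = [wj + eta * (xi - wj) for wj, xi in zip(w[j], x)]
```

In a full SOM both eta and the neighbourhood radius shrink over time, so the map first orders globally and then fine-tunes locally.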
EE4210 Tutorial 9
Fuzzy Associative Memory (FAM)
Consider the control of an inverted pendulum with two fuzzy state variables and one fuzzy control variable. State variable: the angle that the pendulum makes with the vertical.
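A min-max (Mamdani-style) evaluation of a small FAM rule bank can be sketched as follows; the membership grades and the two rules are illustrative, not the tutorial's actual FAM matrix:

```python
# Fuzzified state: degree to which the current angle / angular velocity
# belongs to each linguistic term
angle = {'NEG': 0.2, 'ZERO': 0.7, 'POS': 0.0}
velocity = {'NEG': 0.0, 'ZERO': 0.4, 'POS': 0.6}

# FAM rules: (angle term, velocity term) -> control term
rules = [
    (('ZERO', 'ZERO'), 'ZERO'),
    (('ZERO', 'POS'),  'NEG'),   # upright but drifting: push back
]

# Each rule fires to the degree min(antecedent grades); a control term's
# overall grade is the max over all rules that conclude it
control = {}
for (a_term, v_term), c_term in rules:
    fire = min(angle[a_term], velocity[v_term])
    control[c_term] = max(control.get(c_term, 0.0), fire)
```

The resulting grades would then be defuzzified (e.g. by the centroid method) to obtain a crisp control signal.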