Topic 18
Competitive Networks
Competitive Networks
The Hamming network is one of the simplest
examples of a competitive network. It was designed
explicitly to solve binary or bipolar pattern
recognition problems.
The neurons in the output layer of the
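The key idea is that a feedforward layer scores the input against stored prototype patterns, and a recurrent competitive (Maxnet) layer then suppresses all but the strongest response. A minimal sketch assuming bipolar ±1 patterns; the prototypes, the ε value, and the function name are illustrative, not from the slides:

```python
import numpy as np

def hamming_net(prototypes, p, eps=None, max_iter=100):
    """Return the index of the stored prototype closest (in Hamming sense) to p."""
    P = np.asarray(prototypes, dtype=float)
    S, R = P.shape
    # Feedforward layer: for bipolar vectors, P[i] . p + R = 2*(R - Hamming distance)
    a = P @ p + R
    # Recurrent competitive layer (Maxnet): each neuron inhibits the others
    eps = eps if eps is not None else 1.0 / (S + 1)   # needs eps < 1/(S-1)
    W2 = -eps * np.ones((S, S))
    np.fill_diagonal(W2, 1.0)
    for _ in range(max_iter):
        a_new = np.maximum(0.0, W2 @ a)   # poslin transfer function
        if np.array_equal(a_new, a):      # competition has settled
            break
        a = a_new
    return int(np.argmax(a))              # index of the winning prototype
```

With two stored 4-bit prototypes, an input that differs from the first prototype in a single position is assigned to the first prototype.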

Topic 17
Continuous Hopfield Networks
Hopfield Recurrent
Neural Networks
A network that was highly influential in
bringing about the resurgence of neural
network research in the early 1980s.
Hopfield emphasized practicality, both in the
implementation

Topic 13 Radial Basis
Function Networks
Different Views on the Design of
Supervised Neural Networks
The design of a supervised neural network
may be pursued in a variety of ways.
Backpropagation algorithm: may be viewed as the
application of a recurs

Topic 14 - Stability
Recurrent Networks (1 of 3)
Recurrent networks have feedback connections
from their outputs to their inputs.
Recurrent networks are potentially more
powerful than feedforward networks, since
they are able to recognize and recall t

Topic 14 Associative Learning
Objectives
The neural networks we have discussed so far have
all been trained in a supervised manner.
In contrast, this topic introduces a collection of
simple rules that allow unsupervised learning.
These rules give net

Topic 7 - Performance
Surfaces and
Optimum Points
Performance Learning
During training, the network parameters
(weights and biases) are adjusted in an effort to
optimize the performance of the network
Two steps in the optimization process:
Find a qua
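The first step is presumably a quadratic approximation of the performance index F; the standard form is the second-order Taylor expansion about a point x*:

```latex
F(\mathbf{x}) \approx F(\mathbf{x}^*)
  + \nabla F(\mathbf{x}^*)^{T}(\mathbf{x}-\mathbf{x}^*)
  + \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}^*)^{T}\,\nabla^{2}F(\mathbf{x}^*)\,(\mathbf{x}-\mathbf{x}^*)
```

where ∇F is the gradient and ∇²F the Hessian of the performance index.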

Topic 10 Principal Component
Analysis
Also known as Hotelling transform or
Karhunen-Loève transformation
Motivation for
Principal Component Analysis (1 of 3)
A common problem in statistical pattern
recognition is that of feature selection, or feature
ex
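Feature extraction with PCA amounts to projecting the data onto the eigenvectors of its covariance matrix that carry the largest eigenvalues. A minimal sketch; the data points and the function name are made up for illustration:

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the k leading principal components."""
    Xc = X - X.mean(axis=0)               # center the data
    C = np.cov(Xc, rowvar=False)          # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)  # eigh since C is symmetric
    order = np.argsort(eigvals)[::-1]     # sort by decreasing variance
    W = eigvecs[:, order[:k]]             # top-k eigenvectors as columns
    return Xc @ W

# Points spread mainly along the line y = x: one component keeps most variance
X = np.array([[0., 0.], [1., 1.1], [2., 1.9], [3., 3.2]])
Z = pca(X, 1)   # 4 samples reduced from 2 features to 1
```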

Topic 11
Multilayer Perceptrons
and
Backpropagation Algorithms
1. Illustrate the power of multilayer
perceptrons
2. Train multilayer perceptrons
with the BP algorithm
Feedforward Multilayer Perceptron
[Figure: feedforward network with R inputs and three layers of S1, S2, and S3 neurons]
Example 1: The XOR Problem
To il
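A single-layer perceptron cannot compute XOR, but a 2-2-1 multilayer network can. The sketch below uses hand-picked weights rather than BP training, just to show that one hidden layer suffices; the particular weight values are illustrative:

```python
def step(x):
    """Hard-limit transfer function."""
    return 1 if x >= 0 else 0

def xor_net(x1, x2):
    # Hidden layer (hand-picked weights, not learned): OR and AND detectors
    h_or  = step(1*x1 + 1*x2 - 0.5)
    h_and = step(1*x1 + 1*x2 - 1.5)
    # Output layer: fires when OR is on but AND is off, i.e. XOR
    return step(1*h_or - 1*h_and - 0.5)
```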

Topic 9
Widrow-Hoff Learning
ADALINE Network (1 of 3)
In 1960, Bernard Widrow and his graduate student
Marcian Hoff introduced
the ADALINE (ADAptive LInear Neuron, or ADAptive
LINear Element) network, and
a learning rule which they called the LMS (Lea
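The LMS rule adjusts the weights in proportion to the error of the linear neuron, w ← w + 2αep with e = t − (w·p + b). A minimal sketch; the training data, learning rate, and epoch count are made up for illustration:

```python
import numpy as np

def lms_train(P, T, alpha=0.03, epochs=300):
    """Single-output ADALINE trained with the LMS rule."""
    rng = np.random.default_rng(0)
    w = rng.normal(size=P.shape[1]) * 0.1   # small random initial weights
    b = 0.0
    for _ in range(epochs):
        for p, t in zip(P, T):
            e = t - (w @ p + b)       # error of the linear neuron
            w = w + 2 * alpha * e * p
            b = b + 2 * alpha * e
    return w, b

# Made-up targets that are an exact linear function: t = 2*p1 - p2 + 1
P = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.], [2., 1.]])
T = 2 * P[:, 0] - P[:, 1] + 1
w, b = lms_train(P, T)   # w approaches [2, -1], b approaches 1
```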

Topic 8 Performance
Optimization
Basic Optimization Algorithm
Develop algorithms to optimize a performance
index F(x)
Optimize will mean to find the value of x that
minimizes F(x)
xk+1 = xk + αk pk
or
Δxk = xk+1 − xk = αk pk
pk - Search direction
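A common choice of search direction is pk = −∇F(xk) (steepest descent), so the iteration becomes xk+1 = xk − αk ∇F(xk). A minimal sketch with a fixed learning rate on a made-up quadratic index F(x) = x1² + 10·x2²:

```python
import numpy as np

def steepest_descent(grad, x0, alpha=0.05, iters=200):
    """Iterate x_{k+1} = x_k + alpha * p_k with p_k = -grad F(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        p = -grad(x)           # steepest-descent search direction
        x = x + alpha * p      # fixed learning rate alpha
    return x

# F(x) = x1^2 + 10*x2^2 has gradient [2*x1, 20*x2] and minimum at the origin
grad_F = lambda x: np.array([2 * x[0], 20 * x[1]])
x_min = steepest_descent(grad_F, [2.0, 1.0])
```

For this quadratic the iteration is stable only for alpha < 2/λmax = 0.1, so alpha = 0.05 converges.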

Topic 3 - Perceptron
Learning Rule
Perceptron
A class of neural networks developed by Frank
Rosenblatt and several other researchers in the
late 1950s.
Rosenblatt's key contribution: introducing a
learning rule for training perceptron networks to
solve
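The rule adds the error e = t − a to the weights, w ← w + e·p and b ← b + e, where a = hardlim(w·p + b). A minimal sketch on a made-up linearly separable problem (logical AND), for which the perceptron convergence theorem guarantees a solution:

```python
import numpy as np

def perceptron_train(P, T, epochs=20):
    """Train a single perceptron with the perceptron learning rule."""
    w = np.zeros(P.shape[1])
    b = 0.0
    for _ in range(epochs):
        for p, t in zip(P, T):
            a = 1 if w @ p + b >= 0 else 0   # hardlim transfer function
            e = t - a                        # error drives the update
            w = w + e * p
            b = b + e
    return w, b

# Logical AND: linearly separable, so the rule converges in a few epochs
P = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([0, 0, 0, 1])
w, b = perceptron_train(P, T)
```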

Topic 4 Linear Transformations for
Neural Networks
Contents
Vector space and subspace
Matrix transformation and linear transformation
Coordinate systems and transition matrix
Similarity transformation and similar matrix
Diagonalizable matrix
Orthogonall
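Several of these topics can be illustrated numerically. The sketch below, with an arbitrarily chosen matrix, shows diagonalization as a similarity transformation A = B D B⁻¹, where the columns of B are the eigenvectors of A:

```python
import numpy as np

# Arbitrary matrix with distinct eigenvalues (2 and 3), hence diagonalizable
A = np.array([[2., 1.],
              [0., 3.]])
eigvals, B = np.linalg.eig(A)     # columns of B are eigenvectors of A
D = np.diag(eigvals)              # similar, diagonal matrix
# Similarity transformation: A and D represent the same linear
# transformation expressed in two different coordinate systems
A_rebuilt = B @ D @ np.linalg.inv(B)
```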

Topic 4
Supervised Hebbian Learning
Hebb's Postulate
When an axon of cell A is near enough to excite a cell
B and repeatedly or persistently takes part in firing it,
some growth process or metabolic change takes place
in one or both cells such that A's e
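In its supervised form, Hebb's postulate becomes the outer-product rule W = Σq tq pqᵀ: a weight grows when its input and target elements are active together. A minimal sketch with made-up orthonormal prototype patterns, for which recall is exact:

```python
import numpy as np

def hebb_weights(P, T):
    """Supervised Hebb rule: W = T P^T (patterns stored as columns of P and T)."""
    return T @ P.T

s = 1 / np.sqrt(2)
P = np.array([[s,  s],      # two orthonormal input prototypes (columns)
              [s, -s]])
T = np.array([[ 1., -1.],   # their target patterns (columns)
              [-1.,  1.]])
W = hebb_weights(P, T)
# Because the columns of P are orthonormal, W @ P[:, q] recovers T[:, q] exactly
```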

Topic 5 Optimal Linear
Associative Mapping
Associative Memory Mapping
Associative recall may in general be defined as a
mapping in which a finite number of input (pattern)
vectors is transformed into a given set of output
vectors.
In the case of unidi
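When the stored input patterns are linearly independent but not orthogonal, the optimal (minimum squared error) linear mapping uses the pseudoinverse, W = T P⁺. A minimal sketch with made-up patterns:

```python
import numpy as np

def olam_weights(P, T):
    """Optimal linear associative mapping: W = T P^+ (pseudoinverse rule)."""
    return T @ np.linalg.pinv(P)

# Linearly independent but non-orthogonal input patterns as columns
P = np.array([[1., 1.],
              [0., 1.]])
T = np.array([[1., 0.],    # desired output patterns as columns
              [0., 1.]])
W = olam_weights(P, T)
# For linearly independent prototypes, recall is exact: W @ P equals T
```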

Topic 1 Introduction
to Neural Networks
Outline
Motives for Artificial Neural Networks
Biological Neurons
Characteristics of Artificial Neural Networks
Historic Notes
Applications of Neural Computing
Classical AI and Neural Networks
Classification of Ne