Overall Layout of MLP
Typically, each node in a layer (other than the output layer) is
connected to every node in the next layer by a trainable weight.
The overall layout is illustrated in Figure 5.
Figure 5. A multi-layer network
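The fully connected layout can be sketched in a few lines of NumPy. This is an illustrative 2-3-1 network (layer sizes chosen arbitrarily): every node in one layer feeds every node in the next through a weight, plus a bias per node.

```python
import numpy as np

def sigmoid(x):
    # Logistic activation, common in introductory MLP presentations.
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))   # weights: 2 inputs -> 3 hidden nodes
b1 = rng.normal(size=3)
W2 = rng.normal(size=(1, 3))   # weights: 3 hidden nodes -> 1 output
b2 = rng.normal(size=1)

def forward(x):
    h = sigmoid(W1 @ x + b1)     # hidden-layer activations
    return sigmoid(W2 @ h + b2)  # output-layer activation

print(forward(np.array([0.0, 1.0])))  # a single value in (0, 1)
```

Each matrix row holds the incoming weights of one node, so "every node connected to every node in the next layer" is simply a dense matrix.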
Node Internals
Figu
Controlling Dynamic Physical Systems
Conventional control theory requires a mathematical model to predict the
behaviour of a process so that appropriate control decisions can be made.
Many processes are too complicated to model accurately.
Often, not enough is known about the process to construct such a model.
Classification Tasks
Statistical and connectionist approaches to machine learning are
related to function approximation methods in mathematics.
For the purposes of illustration let us assume that the learning task is
one of classification.
That is, we wish to assign each input to one of a fixed set of classes.
Decision Trees
A decision tree is a tree in which each branch node represents a choice
between a number of alternatives, and each leaf node represents a
classification or decision.
For example, a decision tree might help a bank decide whether a person
should be granted a loan.
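A decision tree is easy to render directly as nested tests. The attributes and thresholds below are purely illustrative, not taken from any real lending policy: branch nodes choose between alternatives, leaves return the decision.

```python
# A toy loan decision tree. Hypothetical attributes/thresholds,
# chosen only to illustrate the branch-node / leaf-node structure.
def loan_decision(applicant):
    if applicant["income"] >= 30000:          # branch node
        if applicant["existing_debt"] < 10000:  # branch node
            return "grant"                      # leaf: decision
        return "refer to manager"               # leaf: decision
    else:
        if applicant["has_collateral"]:         # branch node
            return "grant"
        return "refuse"

print(loan_decision({"income": 45000, "existing_debt": 5000,
                     "has_collateral": False}))  # grant
```

Reading from the root to a leaf gives the sequence of choices that justifies the decision, which is why trees are popular when decisions must be explained.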
Backprop Specification in tlearn
We must specify the structure of the network and the training
patterns. This is done, for XOR, using three files:
the .cf (configuration) file, which describes the network, and the
.data and .teach files, which hold the input patterns and their
target outputs.
xor.cf:
NODES:
nodes = 3
inputs = 2
outputs = 1
Generalization
Generalization means performance on unseen input patterns, i.e.
input patterns which were not among the patterns on which the
network was trained.
If you train for too long, you can often get the total sum-squared error
very low by over-fitting the training patterns; generalization to
unseen patterns then typically gets worse.
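The same effect is easy to demonstrate with a simpler function approximator than a network. In this sketch (synthetic data, arbitrary seed) a degree-5 polynomial drives the training error essentially to zero by fitting the noise, where a degree-1 fit cannot:

```python
import numpy as np

rng = np.random.default_rng(1)
# Noisy samples of the underlying straight line y = 2x.
x = np.linspace(0.0, 1.0, 12)
y = 2.0 * x + rng.normal(scale=0.1, size=x.size)

x_train, y_train = x[::2], y[::2]    # patterns used for training
x_test,  y_test  = x[1::2], y[1::2]  # unseen patterns

errors = {}
for degree in (1, 5):
    coeffs = np.polyfit(x_train, y_train, degree)
    errors[degree] = (
        np.mean((np.polyval(coeffs, x_train) - y_train) ** 2),  # train
        np.mean((np.polyval(coeffs, x_test) - y_test) ** 2),    # test
    )
print(errors)
```

The degree-5 polynomial passes through all six training points exactly (near-zero training error), but usually does worse on the held-out points; low training error is therefore not evidence of good generalization.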
Neural Models of Computation
Biological neurons provide a model for computation (after all, brains
are built from them).
They have inputs (dendrites) and outputs (the axon); a response to the
inputs is generated by a process that gives rise to an electrical pulse
travelling along the axon.
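The usual computational abstraction of this picture is a threshold unit: dendrites become numeric inputs, the axon a single output, and the firing decision a thresholded weighted sum. A minimal sketch (weights and threshold chosen by hand for illustration):

```python
# A simple artificial neuron: output 1 ("fires") if the weighted
# sum of its inputs reaches the threshold, else 0.
def neuron(inputs, weights, threshold):
    activation = sum(w * x for w, x in zip(weights, inputs))
    return 1 if activation >= threshold else 0

# With weights (1, 1) and threshold 2 the unit computes logical AND.
print(neuron([1, 1], [1, 1], 2))  # 1
print(neuron([1, 0], [1, 1], 2))  # 0
```

Making the weights and threshold trainable, and replacing the hard threshold by a smooth function, gives the nodes used in multi-layer networks.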
Expected Error Pruning
Approximate expected error assuming that we prune at a particular node.
Approximate backed-up error from children assuming we did not prune.
If expected error is less than backed-up error, prune.
(Static) Expected Error
If we prune at a node, it becomes a leaf labelled with its majority
class; the (static) expected error estimates how often examples
reaching that node would then be misclassified.
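One common static estimate is the Laplace expected-error formula (in the style of Niblett and Bratko); whether these notes use exactly this variant is an assumption, but it illustrates the computation:

```python
def expected_error(N, n, k):
    # Laplace (static) expected-error estimate for pruning a node.
    #   N: training examples reaching the node
    #   n: number of those in the majority class
    #   k: number of classes
    return (N - n + k - 1) / (N + k)

# A node with 10 examples, 8 in the majority class, 2 classes:
print(expected_error(10, 8, 2))  # (10 - 8 + 1) / 12 = 0.25
```

The backed-up error of an internal node is the weighted average of its children's errors; the node is pruned when its own expected error is no worse than that average.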
Two Passes of Computation
FORWARD PASS: weights fixed; input signals are propagated through the
network and the outputs calculated. Outputs o_j are compared with the
desired outputs d_j, and the error signal e_j = d_j - o_j is computed.
BACKWARD PASS: starts with the output layer and propagates the error
signals backwards through the network, layer by layer; the local
gradient of each node is computed and its incoming weights are
adjusted accordingly.
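The two passes can be sketched for a tiny sigmoid network trained with squared error. This is a minimal NumPy illustration (layer sizes, seed, and learning rate are arbitrary), not the tlearn implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 2)), np.zeros(2)  # hidden layer
W2, b2 = rng.normal(size=(1, 2)), np.zeros(1)  # output layer
eta = 0.5  # learning rate

def train_step(x, d):
    global W1, b1, W2, b2
    # FORWARD PASS: weights fixed, signals propagated, error computed.
    h = sigmoid(W1 @ x + b1)
    o = sigmoid(W2 @ h + b2)
    e = d - o                                 # e_j = d_j - o_j
    # BACKWARD PASS: local gradients, output layer first, then hidden.
    delta_o = e * o * (1 - o)                 # output-node gradients
    delta_h = (W2.T @ delta_o) * h * (1 - h)  # hidden-node gradients
    W2 += eta * np.outer(delta_o, h); b2 += eta * delta_o
    W1 += eta * np.outer(delta_h, x); b1 += eta * delta_h
    return float(e @ e)                       # sum-squared error

err = train_step(np.array([0.0, 1.0]), np.array([1.0]))
```

Repeating `train_step` on a pattern drives its error down, since each step moves the weights a small distance along the negative error gradient.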
Summary of Splitting Criterion
Some people learn best from an example; others like to see the most general
formulation of an algorithm. If you are an "examples" person, don't let the
following subscript-studded presentation panic you.
Assume there are k classes to which the examples may belong.
Propositional Learning Systems
Rather than searching for discriminant functions, symbolic learning systems find
expressions equivalent to sentences in some form of logic. For example, we may
distinguish objects according to two attributes: size and colour.
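A learned expression over such attributes can be written down directly. A minimal sketch (attribute names and the particular concept are illustrative):

```python
# Objects are described by attribute values; a learned concept is a
# logical expression over those values rather than a numeric function.
def matches(obj):
    # Example concept: (size = large) AND (colour = red)
    return obj["size"] == "large" and obj["colour"] == "red"

print(matches({"size": "large", "colour": "red"}))  # True
print(matches({"size": "small", "colour": "red"}))  # False
```

Unlike a discriminant function, the expression can be read back as a sentence ("large and red"), which is the main attraction of symbolic learners.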
Choosing Attributes and ID3
The order in which attributes are chosen determines how complicated the
tree is.
ID3 uses information theory to determine the most informative attribute.
A measure of the information content of a message is the inverse of the
probability of receiving the message: the less probable a message, the
more information it conveys. Taking logarithms, a message of
probability p carries -log2(p) bits of information.
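This measure, and the entropy built from it, can be computed directly; ID3 then chooses the attribute whose split most reduces the entropy of the class distribution:

```python
import math

def info_content(p):
    # Information (in bits) of a message with probability p.
    return -math.log2(p)

print(info_content(0.5))    # 1.0 bit
print(info_content(0.125))  # 3.0 bits

def entropy(class_counts):
    # Average information needed to identify an example's class,
    # given the counts of each class at a node.
    total = sum(class_counts)
    return sum((c / total) * info_content(c / total)
               for c in class_counts if c)

print(entropy([7, 7]))   # 1.0 -- evenly mixed node
print(entropy([14, 0]))  # 0.0 -- pure node
```

An attribute is informative to the extent that the weighted entropy of the subsets it creates is lower than the entropy before the split.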