Neural Net Project 3:
Simple Networks for Function Approximation
(1) Download and unzip Map.zip (Matlab version) and compile the C program.
Familiarize yourself with the code.
(a) Download the file Twod.tra from the webpage. This file has 8 inputs and 7
outputs.

Neural Net Project 2:
Small Linear Networks for Function Approximation
1. Read the Reference Material below.
2. Using the data specified in part C, implement the linear equation solution of part D,
printing out r, c, and w. Implement the steepest descent
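The two solution methods named above can be sketched as follows. This is an illustrative Python sketch, not the course's MATLAB code: the data here are synthetic stand-ins for the part-C data file, and the names r (input autocorrelation), c (cross-correlation), and w (weights) follow the assignment's notation under the assumption that the linear-equation solution solves R w = c.

```python
import numpy as np

# Hypothetical synthetic data standing in for the part-C data file:
# N = 3 inputs, one linear output, Nv = 200 training patterns.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([0.5, -1.0, 2.0])
t = X @ w_true                              # desired outputs

# Linear-equation solution: solve R w = c, where R is the input
# autocorrelation matrix and c the input/output cross-correlation.
R = X.T @ X / len(X)
c = X.T @ t / len(X)
w = np.linalg.solve(R, c)

# Steepest descent on the MSE E(w) = mean((t - X w)^2):
w_sd = np.zeros(3)
lr = 0.1
for _ in range(500):
    g = -2.0 * X.T @ (t - X @ w_sd) / len(X)   # gradient of the MSE
    w_sd = w_sd - lr * g

print(R, c, w, w_sd)
```

With noiseless linear data both methods recover the same weight vector; steepest descent just takes many small steps to get there.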

MATLAB review problems
The purpose of this assignment is to familiarize yourself with basic MATLAB commands and script writing.
Points are awarded for programming style as well as for logic building. Except for question 1, NONE OF
THE PROBLEMS NEED BUILT-IN MATLAB FUNCTIONS.

III. Polynomial Basis Function Analyses
9/24/2012
Background
In about 1988, researchers asked:
(Q1) Why do neural nets often outperform
approximate Bayes classifiers?
(Q2) Do neural nets have approximation
theorems such as those for power series
and Fourier series?
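The kind of approximation theorem (Q2) refers to can be seen numerically: a richer power-series basis drives the approximation error toward zero. This is an illustrative sketch, assuming a least-squares fit on a dense grid and sin(pi*x) as an arbitrary target function.

```python
import numpy as np

# Least-squares fit of a power-series (polynomial) basis to a target
# function: increasing the degree shrinks the worst-case error.
x = np.linspace(-1.0, 1.0, 200)
target = np.sin(np.pi * x)

def poly_fit_error(degree):
    # Design matrix with columns 1, x, x^2, ..., x^degree.
    A = np.vander(x, degree + 1, increasing=True)
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.max(np.abs(A @ coef - target))   # worst-case fit error

errors = [poly_fit_error(d) for d in (1, 3, 5, 7)]
```

Each added odd-degree term cuts the error substantially, which is the behavior the classical power-series and Fourier approximation theorems guarantee in the limit.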

IV. Approximation Using Feedforward Nets
9/17/2015
Def: A mapping or approximation network is
one that maps real-valued input vectors to real-valued output vectors.
Comments
(1) Classification nets form a subset of the
approximation or mapping nets.
(2) Th

Homework # 3, EE5353
1. A three-layer MLP has arbitrary connectivity and a form of the simple notation. Let N1(k) denote
the number of units in layer 1 that feed into the kth unit of layer 2 and let n(k,m) denote the index of
the mth input unit that feeds

Homework # 1, EE5353
1. The mean-squared error (MSE) for training a neural network can be written as
E = \frac{1}{N_v} \sum_{p=1}^{N_v} \sum_{i=1}^{M} [t_p(i) - y_p(i)]^2
where Nv is the number of training patterns and M is the number of outputs.
(a) If yp(i) is a function of a we
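The MSE formula above evaluates directly as a double sum over patterns and outputs. A minimal sketch, with illustrative stand-in arrays (not course data) shaped (Nv, M):

```python
import numpy as np

# t_p(i): desired outputs, y_p(i): network outputs, Nv = 3, M = 2.
t = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
y = np.array([[0.9, 0.2],
              [0.1, 0.8],
              [1.0, 0.5]])

Nv, M = t.shape
# E = (1/Nv) * sum over p and i of [t_p(i) - y_p(i)]^2
E = np.sum((t - y) ** 2) / Nv
```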

II. Regression in Feedforward Networks
9/23/2015
Read: Gradient Techniques for
Unconstrained Optimization, located at:
http://www.uta.edu/faculty/manry/ee5353.html
Goals:
(1) Learn notation for data files
(2) Learn structure and notation for feedforward
networks

V. Classification Using Neural Nets
9/17/2015
A. Conventional Classification
Idea: Review conventional classifiers first.
Goals:
(1) Define discriminant function
(2) Optimal Bayes discriminant function
and common forms
(3) Review nearest neighbor classifiers
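A nearest-neighbor classifier, as in goal (3), assigns each pattern the class of its closest stored prototype. A minimal sketch with illustrative prototypes and labels (not course data), using squared Euclidean distance:

```python
import numpy as np

# Stored prototype patterns and their class labels.
prototypes = np.array([[0.0, 0.0],
                       [1.0, 1.0],
                       [4.0, 4.0]])
labels = np.array([0, 0, 1])

def nearest_neighbor(x):
    # Squared Euclidean distance from x to every prototype.
    d = np.sum((prototypes - x) ** 2, axis=1)
    return labels[np.argmin(d)]
```

Note that no training is involved: the classifier's "knowledge" is just the stored prototypes, which is why it is a useful baseline before discussing learned discriminant functions.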

VII. Associative Memories
Goals:
(1) Learn basics of associative memories
(2) Analyze associative memories using tools
developed in previous chapters.
A. Introduction
Interesting Human Trait:
The ability to associate a sensory stimulus
(image, spoken or w
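The simplest model of this association, analyzed with the linear-algebra tools of earlier chapters, is a linear associative memory built from Hebbian outer products. A minimal sketch, assuming orthonormal key (stimulus) vectors so that recall is exact; the patterns are illustrative only:

```python
import numpy as np

# Store associations with W = sum_k v_k x_k^T; recall with W @ x.
keys = np.array([[1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0]])      # orthonormal stimulus vectors x_k
values = np.array([[1.0, -1.0],
                   [2.0,  0.5]])        # associated responses v_k

W = values.T @ keys                     # sum of outer products v_k x_k^T
recalled = W @ keys[0]                  # reproduces values[0]
```

If the keys are only approximately orthogonal, recall picks up crosstalk terms from the other stored pairs, which is the central trade-off in associative memory capacity.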

VI. UNSUPERVISED LEARNING, AND
NEURAL NETS THAT USE IT
11/16/2015
A. Introduction
[Diagram: clustering maps the data set Dx (Nv patterns, N dimensions) to K clusters; compression maps Dx to a data set Dz with Nv patterns and N1 dimensions.]
Def: Unsupervised Learning [1]: Reducing the
N or Nv dimensions of Dx, while preserving
relevant information.
(1) Compression: M
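Both reductions in the definition can be sketched side by side. This is an illustrative Python sketch on synthetic data: PCA is assumed as one common choice of compression (reducing N to N1 feature dimensions), and a single k-means-style assignment step stands in for clustering (mapping the Nv patterns onto K prototypes).

```python
import numpy as np

rng = np.random.default_rng(1)
Dx = rng.normal(size=(100, 5))          # Nv = 100 patterns, N = 5 features

# Compression: project onto the top N1 principal directions (PCA).
N1 = 2
Xc = Dx - Dx.mean(axis=0)               # center the data
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Dz = Xc @ Vt[:N1].T                     # Nv x N1 compressed data set

# Clustering: assign every pattern to its nearest of K prototypes.
K = 3
protos = Dx[:K]                         # initial cluster centers
dists = ((Dx[:, None, :] - protos[None, :, :]) ** 2).sum(axis=2)
assign = dists.argmin(axis=1)           # each pattern mapped to one of K clusters
```

In both cases no target outputs are used: the reduction is judged only by how much relevant structure of Dx it preserves.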