Neural Networks, CAP 6615
Program 4, Due 22 March 2011
This assignment asks you to look at radial basis function networks by first tackling the three data sets of
Program 4 with newrb and newrbe, then with a semisupervised learning system using Laplacian
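The assignment itself uses MATLAB's newrb/newrbe. Purely as a sketch of the idea behind the exact-design variant (newrbe places one Gaussian unit on every training point and solves for the output weights), here is a small Python illustration; the `spread` value and the toy sine data are assumptions for demonstration, not part of the assignment:

```python
import numpy as np

def rbf_exact_fit(X, y, spread):
    """Exact-interpolation RBF network: one Gaussian unit per training point.

    Solves Phi w = y, where Phi[i, j] = exp(-||x_i - c_j||^2 / (2*spread^2))
    and the centers c_j are the training inputs themselves.
    """
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    Phi = np.exp(-d2 / (2.0 * spread ** 2))
    return np.linalg.solve(Phi, y)

def rbf_predict(X_train, w, X_new, spread):
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    Phi = np.exp(-d2 / (2.0 * spread ** 2))
    return Phi @ w

# Toy 1-D regression: the exact-design network reproduces the training
# targets exactly (up to numerical conditioning of Phi).
X = np.linspace(0.0, 1.0, 8).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel()
w = rbf_exact_fit(X, y, spread=0.15)
y_hat = rbf_predict(X, w, X, spread=0.15)
```

newrb, by contrast, adds units greedily until an error goal is met, which is why it usually ends up with far fewer units than newrbe.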

Neural Networks, CAP 6615
Program 3, Due 21 February 2011
Your job is to implement a multilayer perceptron (MLP) in Matlab and test its performance on
three data sets provided along with this assignment.
The MLP will be implemented with two functions:
M =

Input
is a scalar value, namely the desired output value associated with input ,
is a learning rate value, and
where
is a column vector containing the m weights calculated by the best mean-square error fit.

Input
is a column vector containing N desired output values selected from {-1, 1},
is a learning rate value,
(optional arguments: either both are provided or neither is) if provided, they specify the range of x and y values over which to displ
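The exact MATLAB function signatures did not survive extraction above. Purely as a sketch of the training loop such an MLP needs (one hidden layer, tanh units, targets in {-1, 1}, batch gradient descent with learning rate eta; every name and parameter below is hypothetical, and the assignment itself requires MATLAB), in Python:

```python
import numpy as np

def mlp_train(X, d, hidden=4, eta=0.5, epochs=5000, seed=0):
    """Train a one-hidden-layer tanh MLP by batch backpropagation.

    X : (N, n) inputs; d : (N,) desired outputs in {-1, 1}.
    Returns the learned parameters (W1, b1, W2, b2).
    """
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    W1 = rng.normal(0.0, n ** -0.5, (hidden, n))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, hidden ** -0.5, hidden)
    b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(X @ W1.T + b1)            # hidden activations
        y = np.tanh(h @ W2 + b2)              # network outputs
        e = y - d                             # error signal
        delta_out = e * (1 - y ** 2)          # output-layer local gradients
        delta_hid = np.outer(delta_out, W2) * (1 - h ** 2)
        W2 -= eta * delta_out @ h / len(X)
        b2 -= eta * delta_out.mean()
        W1 -= eta * delta_hid.T @ X / len(X)
        b1 -= eta * delta_hid.mean(axis=0)
    return W1, b1, W2, b2

def mlp_predict(params, X):
    W1, b1, W2, b2 = params
    return np.tanh(np.tanh(X @ W1.T + b1) @ W2 + b2)

# XOR with {-1, 1} labels: a problem no single-layer perceptron can solve.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
d = np.array([-1., 1., 1., -1.])
params = mlp_train(X, d)
y = mlp_predict(params, X)
```

The {-1, 1} target coding matches the desired-output convention described above and pairs naturally with a tanh output unit.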

Neural Networks, CAP 6615 Spring 2011
Homework 7, Due 18 April 2011
1. (Haykin Problem 11.3) Consider the Markov chain depicted in Figure P11.3, which is reducible.
Identify the classes of states contained in this state transition diagram.
[Figure P11.3 is not reproduced here; only its transition-probability labels 2/3, 1/3, and 3/4 and the state label x1 survived extraction.]
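The figure itself is not recoverable here, but identifying the classes of a finite chain is mechanical: states i and j belong to the same class exactly when each is reachable from the other. A Python sketch, using a made-up reducible 4-state transition matrix (an assumption for illustration only, not the chain in Figure P11.3):

```python
import numpy as np

def communicating_classes(P):
    """Group the states of a Markov chain into communicating classes.

    P is an (n, n) row-stochastic transition matrix. Two states are in the
    same class iff each is reachable from the other.
    """
    n = len(P)
    reach = np.eye(n, dtype=bool) | (np.asarray(P) > 0)
    # Warshall transitive closure: reach[i, j] = "j reachable from i".
    for k in range(n):
        reach |= reach[:, [k]] & reach[[k], :]
    mutual = reach & reach.T
    classes, seen = [], set()
    for i in range(n):
        if i not in seen:
            cls = {j for j in range(n) if mutual[i, j]}
            classes.append(sorted(cls))
            seen |= cls
    return classes

# Reducible chain: {0, 1} communicate, {2, 3} communicate, and probability
# leaks from {0, 1} into the closed class {2, 3} but never returns.
P = np.array([[0.5, 0.4, 0.1, 0.0],
              [0.3, 0.7, 0.0, 0.0],
              [0.0, 0.0, 0.2, 0.8],
              [0.0, 0.0, 0.6, 0.4]])
print(communicating_classes(P))   # → [[0, 1], [2, 3]]
```

The chain is reducible precisely because it splits into more than one class, with {0, 1} transient and {2, 3} closed.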

Neural Networks CAP 6615, Spring 2011
Homework 6 Solutions
1. (Problem 8.15 Haykin) Let k̃_ij denote the centered counterpart of the ij-th element k_ij of the
Gram matrix K. Derive the following formula (Schölkopf, 1997):

   k̃_ij = k_ij − (1/N) Σ_{m=1}^{N} φᵀ(x_i) φ(x_m) − (1/N) Σ_{m=1}^{N} φᵀ(x_m) φ(x_j)
               + (1/N²) Σ_{m=1}^{N} Σ_{n=1}^{N} φᵀ(x_m) φ(x_n)
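As a numerical sanity check of the formula (a sketch, not part of the solution): centering the Gram matrix via the identity above must agree with building the Gram matrix from explicitly mean-centered feature vectors. Taking φ to be the identity map on some random vectors keeps the check simple:

```python
import numpy as np

rng = np.random.default_rng(0)
Phi = rng.normal(size=(6, 3))      # rows play the role of phi(x_i)
K = Phi @ Phi.T                    # Gram matrix, K[i, j] = phi(x_i)^T phi(x_j)
N = len(K)

# Matrix form of the formula: K~ = K - 1K - K1 + 1K1, where "1" is the
# N x N matrix with every entry equal to 1/N.
one = np.full((N, N), 1.0 / N)
K_tilde = K - one @ K - K @ one + one @ K @ one

# Direct route: subtract the mean feature vector, then form the Gram matrix.
Phi_c = Phi - Phi.mean(axis=0)
K_direct = Phi_c @ Phi_c.T

print(np.allclose(K_tilde, K_direct))   # → True
```

The two (1/N) sums subtract the row and column means of K, and the (1/N²) double sum adds back the grand mean that was subtracted twice.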

Neural Networks
Homework 6, Due 6 April 2011
1. (Problem 8.15 Haykin) Let k̃_ij denote the centered counterpart of the ij-th element k_ij of the
Gram matrix K. Derive the following formula (Schölkopf, 1997):

   k̃_ij = k_ij − (1/N) Σ_{m=1}^{N} φᵀ(x_i) φ(x_m) − (1/N) Σ_{m=1}^{N} φᵀ(x_m) φ(x_j)
               + (1/N²) Σ_{m=1}^{N} Σ_{n=1}^{N} φᵀ(x_m) φ(x_n)

Neural Networks, CAP 6615
Homework 5, Due 18 March 2011
1. (Haykin Problem 6.22) Equations (6.77), (6.78), and (6.79) describe three important properties of
the inner product ⟨f, g⟩ defined in Eq. (6.75). Prove the properties described in those three
equations.

Neural Networks
Homework 4, Due 24 February 2011
1. (Problem 5.3 from Haykin) The example given in Fig. 5.1b depicts a spherically separable
dichotomy. Assume that the four data points outside the separating surface lie on a circle and that the
only data

Neural Networks
Homework 3, Due 2 February 2011
1. On pages 148-149, Haykin discusses how to select initial weights. He notes that LeCun suggested
setting the standard deviation of the weights for all synapses coming into a particular node j to
σ = m_j^(-1/2),
where m_j
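The rule is easy to state in code. A sketch (Python, with a made-up fan-in of 100; the fan-in value is an assumption for illustration) of drawing a node's incoming weights with standard deviation m_j^(-1/2):

```python
import numpy as np

fan_in = 100                # m_j: number of synapses feeding into node j
sigma = fan_in ** -0.5      # LeCun's suggested standard deviation

rng = np.random.default_rng(0)
w = rng.normal(0.0, sigma, size=fan_in)   # incoming weights of node j

# With this scaling, the induced local field w . x for unit-variance inputs
# has roughly unit variance, keeping sigmoidal units out of saturation at
# the start of training.
print(sigma)                # → 0.1
print(float(w.std()))
```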

Neural Networks
Homework 2, Due 21 January 2011
1. (Haykin 3/e problem 1.3)
a. The perceptron may be used to perform numerous logic functions. Demonstrate the
implementation of the binary logic functions AND, OR, and COMPLEMENT. (Assume the input
to AND and
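For reference, one possible set of weights and biases realizing the three functions with a hard-limit perceptron on inputs in {0, 1} (there are infinitely many valid choices; these particular values are an illustration, not the unique answer):

```python
def perceptron(weights, bias, x):
    """Hard-limit perceptron: outputs 1 iff w . x + b > 0, else 0."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0

AND        = lambda x: perceptron([1, 1], -1.5, x)   # fires only for (1, 1)
OR         = lambda x: perceptron([1, 1], -0.5, x)   # fires unless (0, 0)
COMPLEMENT = lambda x: perceptron([-1], 0.5, x)      # inverts a single input

print([AND((a, b)) for a in (0, 1) for b in (0, 1)])   # → [0, 0, 0, 1]
print([OR((a, b)) for a in (0, 1) for b in (0, 1)])    # → [0, 1, 1, 1]
print([COMPLEMENT((a,)) for a in (0, 1)])              # → [1, 0]
```

Each function is linearly separable, which is exactly why a single perceptron suffices (and why XOR, which is not, requires a hidden layer).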

Neural Networks
Homework 1, January 2011
1. Let φ(v) = 1/(1 + e^(−av)). Show that

      dφ/dv = a φ(v)[1 − φ(v)].

   Differentiating directly,

      dφ/dv = a e^(−av) / (1 + e^(−av))²
            = a · [1/(1 + e^(−av))] · [e^(−av)/(1 + e^(−av))]
            = a · [1/(1 + e^(−av))] · [1 − 1/(1 + e^(−av))]
            = a φ(v)[1 − φ(v)].

   What is the value of this expression at the origin?
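A quick numerical check of the identity (a sketch; the particular values of a and v are arbitrary choices):

```python
import math

def phi(v, a):
    """Logistic sigmoid with slope parameter a."""
    return 1.0 / (1.0 + math.exp(-a * v))

a, v, h = 2.0, 0.7, 1e-6
analytic = a * phi(v, a) * (1.0 - phi(v, a))
numeric = (phi(v + h, a) - phi(v - h, a)) / (2.0 * h)   # central difference
print(abs(analytic - numeric) < 1e-8)                    # → True

# At the origin phi(0) = 1/2, so the derivative there is a * (1/2)(1/2) = a/4.
print(a * phi(0.0, a) * (1.0 - phi(0.0, a)))             # → 0.5
```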

Neural Networks, CAP 6615
Program 5, Due 20 April 2011
In this assignment, you'll perform handwritten character classification.
The underlying data are drawn from the CEDAR dataset, which was prepared for the U.S. Postal Service
using images of actual hand