ECE542
Spring 2014
Midterm
Closed Book, Closed Notes
Instructions: Write all your work only on the pages provided. Please include all steps in
your work. If an error is made, it is hard to determine what you meant to say. Leaving
out steps may result in n
ECE542
Homework #4
Spring 2014
Solutions
1. (20) Problem 4.1-2 (modified)
For the architecture of Fig. 4.8, the weights and biases are given by
y = W(2)( W(1) x + b(1) ) + b(2)
where W(2) is 1x2, W(1) is 2x2, x is 2x1, b(1) is 2x1, and b(2) is 1x1,
and where the matrices and vectors are defined by
4.72 3
ECE542
Spring 2013
Midterm
Closed Book, Closed Notes
Instructions: Write all your work only on the pages provided. Please include all steps in
your work. If an error is made, it is hard to determine what you meant to say. Leaving
out steps may result in n
ECE542
Homework #2
Spring 2016
1. Problem 1.3 (modified) Design a two-input, one-output neural network with one
hidden layer of two neurons that implements the XOR function: C1 = {(0,0), (1,1)},
C2 = {(1,0), (0,1)}. There is no need to run the iterative algorithm
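Such a network can be written down by hand. The following Python/NumPy sketch uses one hypothetical choice of weights (hidden units acting roughly as OR and AND, with hard-limit activations); many other choices also work:

```python
import numpy as np

def step(v):
    """Hard-limit activation: 1 if v >= 0, else 0."""
    return 1 if v >= 0 else 0

# Hypothetical hand-picked weights (one of many valid choices):
# hidden neuron 1 fires on "x1 OR x2", hidden neuron 2 on "x1 AND x2".
W1 = np.array([[1.0, 1.0],
               [1.0, 1.0]])
b1 = np.array([-0.5, -1.5])
W2 = np.array([1.0, -2.0])   # output: OR and not AND, i.e. XOR
b2 = -0.5

def net(x):
    """Return 1 for class C2 (the XOR-true points), 0 for class C1."""
    h = np.array([step(v) for v in (W1 @ np.asarray(x, float) + b1)])
    return step(W2 @ h + b2)
```

With these weights, (0,0) and (1,1) map to output 0 (class C1) while (1,0) and (0,1) map to 1 (class C2), so no iterative training is needed.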
ECE542
Homework #2
Solutions
Spring 2014
1. Problem 1.3
It takes two lines to separate the classes.
2. Problem 1.4 Modified
Let two one-dimensional classes, C1 and C2, have Gaussian distributions with
common variance equal to 1. Their means are μ1 = 6 and μ2 = 20.
12/29/2015
ECE542
Neural Networks
Monday, Wednesday 3:00-4:15
Instructor: H. J. Trussell
Room 2058, EB2
North Carolina State University
Phone: 919-515-5126
Fax: 919-515-5523
email: [email protected]
Office Hours
Monday, Wednesday 9:30-10:30
Other
Lecture 3
ECE542 Spring 2016
Table 1.1 Perceptron Convergence Algorithm
Note the use of the error function e(n) = d(n) - y(n)
y(n) = sgn(w(n)^T x(n))
d(n) = +1 if x(n) is in C1 and d(n) = -1 if x(n) is in C2
Update algorithm:
w(n+1) = w(n) + η (d(n) - y(n)) x(n)
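The update rule above can be sketched in Python/NumPy. This is a hypothetical implementation: the learning-rate default and the stopping rule are my assumptions, not part of the table:

```python
import numpy as np

def perceptron_train(X, d, eta=1.0, epochs=100):
    """Perceptron convergence algorithm: w(n+1) = w(n) + eta*(d(n) - y(n))*x(n).
    X: samples as rows; d: desired labels in {+1, -1}."""
    Xa = np.hstack([np.ones((len(X), 1)), X])   # augment inputs with a bias term
    w = np.zeros(Xa.shape[1])
    for _ in range(epochs):
        errors = 0
        for x, t in zip(Xa, d):
            y = 1 if w @ x >= 0 else -1          # sgn, taking sgn(0) = +1
            w += eta * (t - y) * x               # no change when t == y
            errors += int(t != y)
        if errors == 0:                          # converged: all samples correct
            break
    return w
```

On a linearly separable set, e.g. x = 0, 1 with d = -1 and x = 5, 6 with d = +1, the loop terminates once every sample is classified correctly.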
Notes on the
ECE542-001
Homework #0
Spring 2017
1. We have not covered this problem in class. It is designed to give you a refresher
in the type of mathematics that we'll be using in the course.
(c) Plot the linear fit and the data points using MATLAB.
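A degree-1 least-squares fit of this kind can be sketched in Python/NumPy; the data points here are hypothetical, since the assignment's actual data is not shown:

```python
import numpy as np

# Hypothetical data points (the assignment's actual data is not shown here)
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.1, 4.9, 7.0])

# Degree-1 least-squares fit: returns [slope, intercept]
slope, intercept = np.polyfit(x, y, 1)

# In MATLAB the equivalent would be p = polyfit(x, y, 1), then
# plot(x, y, 'o', x, polyval(p, x), '-') to show the points and the fit.
```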
2. This problem
ECE542
Homework #4
Spring 2017
1. (10) Problem 4.1-2 (modified)
For the architecture of Fig. 4.8, the weights and biases are given by
y = W(2)( W(1) x + b(1) ) + b(2)
where W(2) is 1x2, W(1) is 2x2, x is 2x1, b(1) is 2x1, and b(2) is 1x1,
and where the matrices and vectors are defined by
4.72 3.51
ECE542
Homework #6
Spring 2017
1. 5.30 (Modified) Use the MATLAB routine kmeans for this problem.
a. Produce 1000 random samples uniformly distributed on the unit square
[0,1]x[0,1]. Find the clusters for K=4 and K=8. Plot the results and indicate
the clusters
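As a stand-in for MATLAB's kmeans, the same experiment can be sketched with a plain Lloyd's-algorithm implementation in Python/NumPy (the initialization and iteration cap are my choices):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(1000, 2))   # 1000 points on the unit square

def kmeans(X, K, iters=100):
    """Plain Lloyd's algorithm: a stand-in for MATLAB's kmeans routine."""
    centers = X[rng.choice(len(X), K, replace=False)]   # random initial centers
    for _ in range(iters):
        # assign each point to its nearest center
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = np.argmin(dists, axis=1)
        # recompute each center as the mean of its assigned points
        new = np.array([X[labels == k].mean(0) if np.any(labels == k)
                        else centers[k] for k in range(K)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

labels4, centers4 = kmeans(X, 4)   # repeat with K=8 for part two
```

Plotting the labeled points (e.g. with MATLAB's gscatter, or matplotlib in Python) then shows the four clusters and their centers.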
ECE542-001
Homework #1
Spring 2017
1. Using the functional notation, f(x), as the activation function for all neurons in the
neural network shown in Fig. 16, page 22, the output of the first hidden layer for
an input vector x is u, given by
u = f(Wx + b)
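The layer output u = f(Wx + b) can be computed directly. The sizes, weight values, and the choice of a logistic sigmoid for f below are hypothetical, since the problem leaves them to Fig. 16:

```python
import numpy as np

def f(v):
    """Example activation: logistic sigmoid (the problem leaves f generic)."""
    return 1.0 / (1.0 + np.exp(-v))

# Hypothetical sizes and values: 3 inputs feeding 2 hidden neurons.
W = np.array([[0.2, -0.5,  0.1],
              [0.4,  0.3, -0.2]])
b = np.array([0.1, -0.1])
x = np.array([1.0, 2.0, 3.0])

u = f(W @ x + b)   # first-hidden-layer output, u = f(Wx + b)
```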
1/10/2016
Knowledge refers to stored information or models used by a person or
machine to interpret, predict, and appropriately respond to the outside world.
A priori knowledge:
- used to design the neural network architecture
- used to preprocess data
Observations/measurements
2/9/2016
Adjustment of weights is based on the total error,
using partial derivatives of the error with respect to the various weights:
y = f( W3 f( W2 f( W1 x + b1 ) + b2 ) + b3 )
(Figure: induced local fields v1, v2, v3 and layer outputs y1, y2, y3; the network output is y = y3.)
Adjustment of weights (hidden neuron, cont'd)
Note b
ECE542
Homework #3
Spring 2016
1. Problem 2.2
Note: eq.(2.28) can be used as a starting point also. Formulate the problem using
matrix-vector representation and solve the problem using vector derivatives.
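The matrix-vector formulation asked for here can be sketched as follows: setting the vector derivative of the squared error to zero yields the normal equations, which a few lines of NumPy solve (the data below is hypothetical):

```python
import numpy as np

# For E(w) = ||d - Xw||^2, the vector derivative is dE/dw = -2 X^T (d - Xw).
# Setting it to zero gives the normal equations X^T X w = X^T d.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))            # hypothetical data matrix
w_true = np.array([1.0, -2.0, 0.5])
d = X @ w_true                          # noise-free targets, so the fit is exact

w = np.linalg.solve(X.T @ X, X.T @ d)   # solve rather than invert explicitly
```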
2. Problem 2.8 (Modified)
a) Repeat the classifica
ECE542
Homework #1
Spring 2016
Solutions
1. Using the functional notation, f(x), as the activation function for all neurons in the
neural network shown in Fig. 16, page 22, the output of the first hidden layer for
an input vector x is u, given by
u = f(Wx + b)
ECE542-001
Homework #1
Spring 2016
1. Using the functional notation, f(x), as the activation function for all neurons in the
neural network shown in Fig. 16, page 22, the output of the first hidden layer for
an input vector x is u, given by
u = f(Wx + b)
1/25/2016
Maximum a posteriori (MAP) Estimation
Find the most probable weight (parameter) vector for the
general model
d = F(w, x)
where the data x and the desired output d are known.
For the linear model
d = w^T x
and Gaussian distributions, the MAP estimate
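For the linear-Gaussian case, the MAP estimate takes the familiar ridge-regression form; a sketch, assuming a zero-mean Gaussian prior on w with variance sigma_w2 and Gaussian observation noise with variance sigma_n2 (the demo data is hypothetical):

```python
import numpy as np

def map_estimate(X, d, sigma_n2, sigma_w2):
    """MAP weights for d = X w + Gaussian noise (variance sigma_n2), with a
    zero-mean Gaussian prior on w (variance sigma_w2). The estimate is the
    ridge form w = (X^T X + lam I)^{-1} X^T d with lam = sigma_n2 / sigma_w2."""
    lam = sigma_n2 / sigma_w2
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ d)

rng = np.random.default_rng(2)
X = rng.normal(size=(40, 2))
w_true = np.array([2.0, -1.0])
d = X @ w_true                                            # noise-free demo data

w_map = map_estimate(X, d, sigma_n2=1e-8, sigma_w2=1.0)   # weak prior ~ ML fit
```

As the prior variance shrinks (lam grows), the estimate is pulled toward zero; as it grows, the estimate approaches the maximum-likelihood (least-squares) solution.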
2/3/2016
Multilayer Perceptrons
(Neural Networks with at least one hidden layer)
We build on basic perceptrons, LMS and other optimization methods
Basic features of neural networks
1. Each activation function is differentiable and usually nonlinear
2. Network
1/27/2016
is a function of the accuracy of the model and the
estimate of the parameters/weights
This can be improved
1/27/2016
Bias
V(w)
Least Mean Square Algorithm (Chapter 3)
The LMS method was
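The LMS (Widrow-Hoff) update can be sketched in a few lines of Python/NumPy; the step size, epoch count, and demo data below are my assumptions:

```python
import numpy as np

def lms(X, d, eta=0.01, epochs=200):
    """Widrow-Hoff LMS: w(n+1) = w(n) + eta * e(n) * x(n), e(n) = d(n) - w^T x(n)."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, t in zip(X, d):
            e = t - w @ x            # instantaneous error
            w += eta * e * x         # stochastic gradient step on e^2 / 2
    return w

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 2))
d = X @ np.array([1.0, -1.0])        # noise-free linear data

w_lms = lms(X, d)
```

On noise-free linear data the weights converge to the true values; the step size eta must be small enough relative to the input correlation for the iteration to remain stable.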
2/8/2016
Multilayer Neural Network
Architecture with two hidden layers
(Diagram: input x passes through three layers with parameters (W1, b1), (W2, b2), (W3, b3).)
y = f( W3 f( W2 f( W1 x + b1 ) + b2 ) + b3 )
Need, for each Wk and bk, the derivatives of the cost function.
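The two-hidden-layer forward pass on this slide can be sketched in Python/NumPy; the layer sizes and random weights are hypothetical, and tanh stands in for the unspecified activation f:

```python
import numpy as np

def f(v):
    return np.tanh(v)    # stand-in activation; the slides leave f generic

# Hypothetical architecture: 2 inputs -> 3 hidden -> 3 hidden -> 1 output
rng = np.random.default_rng(4)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)
W2, b2 = rng.normal(size=(3, 3)), np.zeros(3)
W3, b3 = rng.normal(size=(1, 3)), np.zeros(1)

def forward(x):
    """y = f(W3 f(W2 f(W1 x + b1) + b2) + b3)"""
    return f(W3 @ f(W2 @ f(W1 @ x + b1) + b2) + b3)

y = forward(np.array([0.5, -0.5]))
```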
ECE542
Homework #5
Spring 2017
1. Conjugate gradient property:
2. Using the MATLAB neural network toolbox, find neural networks that classify
the halfmoon problem with a separation distance: dist = -5.0, width = 6, radius =
10. Compare three methods: The
ECE542
Homework #2
Spring 2017
1. Problem 1.3 (modified) Design a two-input, one-output neural network with one
hidden layer of two neurons that implements the XOR function: C1 = {(0,0), (1,1)},
C2 = {(1,0), (0,1)}. There is no need to run the iterative algorithm
ECE542
Neural Networks
Spring 2017
Monday, Wednesday 3:00-4:15, Room 1228 EB3
Course Description: The course provides the foundation for designing and using neural
networks and other methods of machine learning. The approach of the course is to
emphasize
ECE542
Homework #5
Spring 2014
Due Monday 3/24/14
1. Create static data sets to be used for later problems and examples. The sets
consist of two classes.
a) Using the halfmoon data, create sets of 3000 samples (1500 in each class) with
separation dist = -
ECE542-001
Homework #1
Spring 2014
Solutions
1. Using the functional notation, f(x), as the activation function for all neurons in the
neural network shown in Fig. 16, page 22, the output of the first hidden layer for
an input vector x is u, given by
u = f(Wx + b)
ECE542-001
Homework #1
Spring 2014
Due Thursday 1/16/14, 2:20pm
1. Using the functional notation, f(x), as the activation function for all neurons in the
neural network shown in Fig. 16, page 22, the output of the first hidden layer for
an input vector x
ECE542
Homework #2
Spring 2014
Due Thursday 1/30/14
1. Problem 1.3
2. Problem 1.4 Modified
Let two one-dimensional classes, C1 and C2, have Gaussian distributions with
common variance equal to 1. Their means are μ1 = 6 and μ2 = 20.
(a) Design a perceptron classifier
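Part (a) can be checked numerically: for equal-variance Gaussian classes the minimum-error decision boundary is the midpoint of the means, so a single neuron with threshold (6 + 20)/2 = 13 suffices. A sketch (the class-label convention is mine):

```python
# Midpoint decision rule for two equal-variance Gaussian classes.
mu1, mu2 = 6.0, 20.0
threshold = (mu1 + mu2) / 2.0        # boundary at 13.0

def classify(x):
    """Return 1 for class C1 (mean 6), 2 for class C2 (mean 20)."""
    return 1 if x < threshold else 2
```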
ECE542
Homework #3
Spring 2017
1. Problem 2.2
Note: eq.(2.28) can be used as a starting point also. Formulate the problem using
matrix-vector representation and solve the problem using vector derivatives.
2. Problem 2.8 (Modified)
a) Repeat the classifica
ECE542
Homework 7
Spring 2017
1. In the last homework, problem 6.3 in the text asked you to derive the dual
Lagrange problem of eq.(6.27) from the primal classification problem stated in
eqs.(6.24-6.26). For this problem, derive the dual problem for the S
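For reference, the hard-margin dual recalled in the problem statement takes the standard form below; this is my reconstruction of the textbook result from the usual derivation (maximize the Lagrangian over the multipliers after eliminating w and b), not a quotation:

```latex
% Hard-margin dual of the primal in eqs. (6.24)-(6.26):
\max_{\alpha}\; Q(\alpha) = \sum_{i=1}^{N} \alpha_i
  - \frac{1}{2} \sum_{i=1}^{N} \sum_{j=1}^{N}
    \alpha_i \alpha_j \, d_i d_j \, \mathbf{x}_i^{T} \mathbf{x}_j
\quad \text{subject to} \quad
\sum_{i=1}^{N} \alpha_i d_i = 0, \qquad \alpha_i \ge 0 .
```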
Homework submission instructions
1) Make a single file that contains the entire homework results; use PDF format. This
should contain everything. The single file allows me to write comments on it for
feedback. Typing will take me too long for most comments