ECE542
Spring 2013
Midterm
Closed Book, Closed Notes
Instructions: Write all your work only on the pages provided. Please include all steps in
your work. If an error is made, it is hard to determine w
ECE542
Spring 2014
Midterm
Closed Book, Closed Notes
Instructions: Write all your work only on the pages provided. Please include all steps in
your work. If an error is made, it is hard to determine w
ECE542
Homework #4
Spring 2014
Solutions
y = W^(2) ( W^(1) x + b^(1) ) + b^(2)
where W^(2) is 1x2, W^(1) is 2x2, x is 2x1, b^(1) is 2x1, and b^(2) is 1x1.
1. (20) Problem 4.1-2 (modified)
For the architecture of Fig. 4.8, the weights and biases are given by
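The two-layer mapping y = W^(2)(W^(1)x + b^(1)) + b^(2) and its dimensions can be checked with a small forward-pass sketch. The weight and bias values below are placeholders, not the actual Fig. 4.8 values (which are not reproduced in this excerpt), and NumPy stands in for the course's MATLAB:

```python
import numpy as np

# Placeholder values; the actual Fig. 4.8 weights/biases are not shown here.
W1 = np.array([[1.0, -1.0],
               [0.5,  2.0]])   # W^(1): 2x2
b1 = np.array([[0.1], [0.2]])  # b^(1): 2x1
W2 = np.array([[1.0, 1.0]])    # W^(2): 1x2
b2 = np.array([[0.3]])         # b^(2): 1x1

x = np.array([[1.0], [2.0]])   # input x: 2x1

# Two-layer composition: y = W^(2) (W^(1) x + b^(1)) + b^(2)
y = W2 @ (W1 @ x + b1) + b2    # y has shape (1, 1): a scalar output
```

Multiplying the shapes through (1x2 times 2x1 plus 1x1) confirms the output is a scalar.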
ECE542
Homework #2
Spring 2016
1. Problem 1.3 (modified) Design a neural network with two inputs, one output, and
one hidden layer of two neurons that implements the XOR function: C1 = {(0,0), (1,1)},
C2 = {
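The problem statement is truncated above, but the standard two-hidden-neuron XOR construction can be verified numerically. The weights below are one illustrative solution (an OR unit and an AND unit with hard-limit activations), not the values the assignment expects; here C1 = {(0,0), (1,1)} maps to output 0:

```python
def step(v):
    # Hard-limit activation: 1 if v >= 0, else 0.
    return 1.0 if v >= 0 else 0.0

def xor_net(x1, x2):
    # Hidden neuron 1 computes OR, hidden neuron 2 computes AND.
    h1 = step(x1 + x2 - 0.5)
    h2 = step(x1 + x2 - 1.5)
    # Output neuron fires when OR is true but AND is not: exactly XOR.
    return step(h1 - h2 - 0.5)

truth_table = {(a, b): xor_net(a, b) for a in (0, 1) for b in (0, 1)}
```

Any weight choice realizing OR and AND in the hidden layer works; the two hidden units supply the two separating lines this problem requires.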
ECE542
Homework #2
Solutions
Spring 2014
1. Problem 1.3
It takes two lines to separate the classes.
2. Problem 1.4 Modified
Let two one-dimensional classes, C1 and C2, have Gaussian distributions with
12/29/2015
ECE542
Neural Networks
Monday, Wednesday 3:00-4:15
Instructor: H. J. Trussell
Room 2058, EB2
North Carolina State University
Phone: 919-515-5126
Fax: 919-515-5523
email: [email protected]
Lecture 3
ECE542 Spring 2016
Table 1.1 Perceptron Convergence Algorithm
Note use of error function e(n) = d(n)-y(n)
y(n) = sgn(w(n)^T x(n))
d(n) = 1, if x(n) is in C1 and d(n) = -1 if x(n) is in C2
Upd
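The update step is cut off above; the full Table 1.1 loop can be sketched as follows, in NumPy rather than the course's MATLAB. The learning rate eta and the toy data are arbitrary illustrative choices:

```python
import numpy as np

def train_perceptron(X, d, eta=1.0, epochs=100):
    # Table 1.1 loop: X is n_samples x n_features, d holds +1/-1 labels.
    # The bias is absorbed by appending a constant 1 to each sample.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        mistakes = 0
        for x, dn in zip(Xb, d):
            y = 1.0 if w @ x >= 0 else -1.0   # y(n) = sgn(w(n)^T x(n))
            e = dn - y                        # e(n) = d(n) - y(n)
            if e != 0.0:
                w = w + eta * e * x           # move w toward misclassified x
                mistakes += 1
        if mistakes == 0:                     # every sample correct: converged
            break
    return w

# Linearly separable toy data: C1 above the line x2 = x1, C2 below it.
X = np.array([[0.0, 1.0], [1.0, 2.0], [1.0, 0.0], [2.0, 1.0]])
d = np.array([1.0, 1.0, -1.0, -1.0])
w = train_perceptron(X, d)
preds = np.array([1.0 if w @ x >= 0 else -1.0
                  for x in np.hstack([X, np.ones((4, 1))])])
```

For linearly separable data the convergence theorem guarantees the loop reaches zero mistakes in finitely many updates.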
ECE542-001
Homework #0
Spring 2017
1. We have not covered this problem in class. It is designed to give you a refresher
in the type of mathematics that we'll be using in the course.
(c) Plot the linea
ECE542
Homework #4
Spring 2017
1. (10) Problem 4.1-2 (modified)
For the architecture of Fig. 4.8, the weights and biases are given by
y = W^(2) ( W^(1) x + b^(1) ) + b^(2)
where W^(2) is 1x2, W^(1) is 2x2, x is 2x1, b^(1) is 2x1, and b^(2) is 1x1.
ECE542
Homework #6
Spring 2017
1. 5.30 (Modified) Use the MATLAB routine kmeans for this problem.
a. Produce 1000 random samples uniformly distributed on the unit square
[0,1]x[0,1]. Find the clusters f
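The assignment calls for MATLAB's kmeans; for illustration, a minimal Lloyd's-algorithm version of part (a) in NumPy. The value k = 4 and the random seed are arbitrary choices here, since the excerpt cuts off before the requested number of clusters:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(1000, 2))   # part (a): 1000 uniform samples

def kmeans(X, k, iters=100):
    # Lloyd's algorithm: alternate nearest-center assignment and mean update.
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        new_centers = np.array([X[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        if np.allclose(new_centers, centers):   # assignments stable: done
            break
        centers = new_centers
    return centers, labels

centers, labels = kmeans(X, k=4)   # k = 4 is an illustrative choice
```

On uniform data the centers settle near the centroids of roughly equal-area cells of the unit square.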
ECE542-001
Homework #1
Spring 2017
1. Using the functional notation, f(x), as the activation function for all neurons in the
neural network shown in Fig. 16, page 22, the output of the first hidden la
1/10/2016
Knowledge refers to stored information or models used by a person or
machine to interpret, predict, and appropriately respond to the outside world.
A priori Knowledge
used to design the neu
1/20/2016
Regression: Preliminaries (Review from last class)
Lecture 4
ECE542 Spring 2016
Maximum a posteriori Estimation
2/9/2016
Adjustment of weights based on total error
Based on partial derivatives of error with respect to various weights
y = φ( W3 φ( W2 φ( W1 x + b1 ) + b2 ) + b3 )
[Figure: layer-by-layer signal flow through induced local fields v1, v2, v3 and layer outputs y1, y2, y3 = y]
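The partial derivatives referred to above can be sketched for this three-layer composition. The sketch below assumes tanh activations and small illustrative layer sizes (neither is specified in the excerpt), and checks the chain-rule gradient against a finite difference:

```python
import numpy as np

phi = np.tanh                                  # activation (assumed tanh)
def dphi(v):
    return 1.0 - np.tanh(v) ** 2               # derivative of tanh

rng = np.random.default_rng(1)
sizes = [2, 3, 3, 1]                           # small illustrative network
Ws = [rng.normal(size=(m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
bs = [rng.normal(size=(m, 1)) for m in sizes[1:]]
x = rng.normal(size=(2, 1))

def forward(Ws, bs, x):
    vs, y = [], x
    for W, b in zip(Ws, bs):
        v = W @ y + b                          # induced local field v_k
        vs.append(v)
        y = phi(v)                             # layer output y_k = phi(v_k)
    return vs, y

vs, y = forward(Ws, bs, x)

# Chain rule back through the three layers to get dy/dW1.
delta = dphi(vs[2])                            # dy/dv3 (y is scalar)
for k in (1, 0):
    delta = (Ws[k + 1].T @ delta) * dphi(vs[k])
grad_W1 = delta @ x.T                          # dy/dW1 = delta x^T

# Finite-difference check on one entry of W1.
eps = 1e-6
Ws[0][0, 0] += eps
_, y_plus = forward(Ws, bs, x)
Ws[0][0, 0] -= eps
numeric = (y_plus - y)[0, 0] / eps
```

The analytic and numeric values agree to within the finite-difference error, which is the essence of adjusting each weight via its partial derivative.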
ECE542
Homework #3
Spring 2016
1. Problem 2.2
Note: eq.(2.28) can be used as a starting point also. Formulate the problem using
matrix-vector representation and solve the problem using vector derivati
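The matrix-vector formulation asked for here leads, after setting the vector derivative of the squared error to zero, to the normal equations. A quick NumPy check (the data are random placeholders, not the problem's values):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))      # data matrix, one sample per row
d = rng.normal(size=(50,))        # desired responses

# Setting the vector derivative of ||X w - d||^2 to zero,
#   grad_w = 2 X^T X w - 2 X^T d = 0,
# yields the normal equations  X^T X w = X^T d.
w_normal = np.linalg.solve(X.T @ X, X.T @ d)

# Cross-check with the library least-squares solver.
w_lstsq, *_ = np.linalg.lstsq(X, d, rcond=None)
```

Both routes give the same minimizer whenever X^T X is nonsingular.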
ECE542
Homework #1
Spring 2016
Solutions
1. Using the functional notation, f(x), as the activation function for all neurons in the
neural network shown in Fig. 16, page 22, the output of the first hid
ECE542-001
Homework #1
Spring 2016
1. Using the functional notation, f(x), as the activation function for all neurons in the
neural network shown in Fig. 16, page 22, the output of the first hidden la
1/25/2016
Maximum a posteriori (MAP) Estimation
Find the most probable weight (parameter) vector for the
general model
d F ( w, x)
where the data x and the desired output d are known
For the linear mo
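The linear-model case is cut off above; for a linear model with Gaussian noise and a zero-mean Gaussian prior on w, the MAP estimate reduces to regularized least squares. A small NumPy sketch (lambda and the data are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))                     # inputs, one sample per row
w_true = rng.normal(size=(5,))
d = X @ w_true + 0.1 * rng.normal(size=(40,))    # noisy desired outputs

# Gaussian likelihood + zero-mean Gaussian prior on w gives
#   w_MAP = argmin ||X w - d||^2 + lam ||w||^2 = (X^T X + lam I)^{-1} X^T d
lam = 0.5                       # assumed noise-to-prior variance ratio
w_map = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ d)

# As lam -> 0 this reduces to the maximum-likelihood (least-squares) estimate.
w_ml, *_ = np.linalg.lstsq(X, d, rcond=None)
```

The prior shrinks the estimate: the MAP solution never has larger norm than the maximum-likelihood one.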
2/3/2016
Multilayer Perceptrons
(Neural Networks with at least one hidden layer)
We build on basic perceptrons, LMS and other optimization methods
Basic features of neural networks
1. Each activation
1/27/2016
is a function of the accuracy of the model and the
estimate of the parameters/weights
This can be improved
Spring 2016
ECE542
Bias
V(w)
2/8/2016
Multilayer Neural Network
Architecture with two hidden layers
[Figure: three-layer network with weight/bias pairs (W1, b1), (W2, b2), (W3, b3), each layer followed by an activation φ(·)]
y = φ( W3 φ( W2 φ( W1 x + b1 ) + b2 ) + b3 )
Need to determine the weights Wk and biases bk by minimizing a cost function; this requires the derivative of y.
ECE542
Homework #5
Spring 2017
1. Conjugate gradient property:
2. Using the MATLAB neural network toolbox, find neural networks that classify
the halfmoon problem with a separation distance: dist = -5
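The conjugacy property in part 1 — successive search directions satisfy p_i^T A p_j = 0 for i ≠ j — can be checked numerically with a short conjugate-gradient run on a symmetric positive-definite system. The matrix here is a random SPD placeholder:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.normal(size=(5, 5))
A = M @ M.T + 5 * np.eye(5)       # random symmetric positive-definite matrix
b = rng.normal(size=(5,))

# Standard conjugate-gradient iteration, keeping each search direction p_i.
x = np.zeros(5)
r = b - A @ x
p = r.copy()
P = []
for _ in range(4):
    P.append(p.copy())
    Ap = A @ p
    alpha = (r @ r) / (p @ Ap)    # exact line-search step along p
    x = x + alpha * p
    r_new = r - alpha * Ap
    beta = (r_new @ r_new) / (r @ r)
    p = r_new + beta * p          # new direction, A-conjugate to the old ones
    r = r_new

# Gram matrix in the A-inner product; off-diagonals should vanish.
G = np.array([[pi @ A @ pj for pj in P] for pi in P])
off = np.abs(G - np.diag(np.diag(G))).max()
```

The off-diagonal entries are zero to machine precision, confirming the directions are mutually A-conjugate.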
ECE542
Homework #2
Spring 2017
1. Problem 1.3 (modified) Design a neural network with two inputs, one output, and
one hidden layer of two neurons that implements the XOR function: C1 = {(0,0), (1,1)},
C2 = {
ECE542
Neural Networks
Spring 2017
Monday, Wednesday 3:00-4:15, Room 1228 EB3
Course Description: The course provides the foundation for designing and using neural
networks and other methods of machin
ECE542
Homework #5
Spring 2014
Due Monday 3/24/14
1. Create static data sets to be used for later problems and examples. The sets
consist of two classes.
a) Using the halfmoon data, create sets of 300
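A generator for double-moon data in the style of the course's halfmoon problems can be sketched as below. The radius, width, and uniform sampling scheme are the usual choices for this construction, but treat the exact parameterization as an assumption rather than the assignment's specification:

```python
import numpy as np

def halfmoon(n, radius=10.0, width=6.0, dist=1.0, seed=0):
    # n points per class. Class 0: upper half-ring centered at the origin;
    # class 1: the mirrored half-ring shifted right by `radius` and down
    # by `dist`. A negative dist makes the two moons overlap.
    rng = np.random.default_rng(seed)
    r = rng.uniform(radius - width / 2, radius + width / 2, size=2 * n)
    theta = rng.uniform(0, np.pi, size=2 * n)
    x = r * np.cos(theta)
    y = r * np.sin(theta)
    x[n:] = x[n:] + radius        # lower moon: shift right...
    y[n:] = -y[n:] - dist         # ...mirror vertically and shift down
    labels = np.repeat([0, 1], n)
    return np.column_stack([x, y]), labels

X, labels = halfmoon(300, dist=1.0)   # 300 samples per class, as in part (a)
```

Varying dist controls the separability of the two classes, which is what the later classification problems exercise.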
ECE542-001
Homework #1
Spring 2014
Solutions
1. Using the functional notation, f(x), as the activation function for all neurons in the
neural network shown in Fig. 16, page 22, the output of the first
ECE542-001
Homework #1
Spring 2014
Due Thursday 1/16/14, 2:20pm
1. Using the functional notation, f(x), as the activation function for all neurons in the
neural network shown in Fig. 16, page 22, the
ECE542
Homework #2
Spring 2014
Due Thursday 1/30/14
1. Problem 1.3
2. Problem 1.4 Modified
Let two one-dimensional classes, C1 and C2, have Gaussian distributions with
common variance equal to 1. Thei
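The statement is cut off, but the setup leads to a standard result: with equal priors and common variance, the Bayes decision threshold lies at the midpoint of the two means. A quick numerical check (the means used are placeholders, not the problem's values):

```python
import numpy as np

mu1, mu2, sigma = 0.0, 2.0, 1.0   # placeholder means; common variance 1

def log_likelihood(x, mu):
    # Gaussian log-density up to a constant shared by both classes.
    return -0.5 * ((x - mu) / sigma) ** 2

# With equal priors, classify by the larger likelihood; the boundary is
# where the two log-likelihoods are equal, i.e. x = (mu1 + mu2) / 2.
xs = np.linspace(-5, 7, 10001)
diff = log_likelihood(xs, mu1) - log_likelihood(xs, mu2)
threshold = xs[np.argmin(np.abs(diff))]
```

Unequal priors shift this threshold away from the midpoint, toward the less probable class.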