ECE542
Spring 2013
Midterm
Closed Book, Closed Notes
Instructions: Write all your work only on the pages provided. Please include all steps in your work. If an error is made, it is hard to determine what you meant to say. Leaving out steps may result in no partial credit.
ECE542
Spring 2013
Final Exam
Instructions: Write all your work only on the pages provided. Please include all steps in your work. If an error is made, it is hard to determine what you meant to say. Leaving out steps may result in no partial credit.
ECE542
Spring 2012
Test 2
Instructions: Write all your work on the pages provided. Please include all steps in your work. If an error is made, it is hard to determine what you meant to say. Leaving out steps may result in no partial credit.
12/29/2015
ECE542
Neural Networks
Monday, Wednesday 3:00-4:15
Instructor: H. J. Trussell
Room 2058, EB2
North Carolina State University
Phone: 919-515-5126
Fax: 919-515-5523
email: hjt@ncsu.edu
Office Hours
Monday, Wednesday 9:30-10:30
Other
1/10/2016
Knowledge refers to stored information or models used by a person or
machine to interpret, predict, and appropriately respond to the outside world.
A priori Knowledge
Used to design the neural network architecture and to preprocess data.
Observations/measurements
Lecture 3
ECE542 Spring 2016
Table 1.1 Perceptron Convergence Algorithm
Note the use of the error function e(n) = d(n) - y(n), where
y(n) = sgn(w(n)^T x(n))
d(n) = +1 if x(n) is in C1, and d(n) = -1 if x(n) is in C2.
Update algorithm:
w(n+1) = w(n) + η [d(n) - y(n)] x(n)
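The update rule above can be sketched in a few lines of Python. This is a minimal illustration, not the text's code: the learning-rate value, the sgn(0) = +1 convention, and the function name are choices of this sketch.

```python
import numpy as np

def perceptron_train(X, d, eta=1.0, epochs=100):
    """Perceptron convergence algorithm: w(n+1) = w(n) + eta*[d(n) - y(n)]*x(n).
    X: samples as rows (with a leading 1 for the bias); d: labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        errors = 0
        for x, t in zip(X, d):
            y = 1.0 if w @ x >= 0 else -1.0   # y(n) = sgn(w(n)^T x(n))
            if y != t:                        # update only on misclassification
                w = w + eta * (t - y) * x
                errors += 1
        if errors == 0:                       # converged: all samples correct
            break
    return w
```

For linearly separable data the loop terminates with every sample on the correct side of the hyperplane.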
Notes on the
2/8/2016
Multilayer Neural Network
Architecture with two hidden layers, with layer parameters (W1, b1), (W2, b2), (W3, b3):
y = φ( W3 φ( W2 φ( W1 x + b1 ) + b2 ) + b3 )
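The nested composition above can be sketched as a loop over (W, b) pairs. This is an illustrative sketch, not the course's code; tanh is one assumed choice of activation φ, and the sizes in the usage are arbitrary.

```python
import numpy as np

def forward(x, params, phi=np.tanh):
    """Forward pass y = phi(W3 phi(W2 phi(W1 x + b1) + b2) + b3).
    params: list of (W, b) pairs, applied in order; phi: activation function."""
    a = x
    for W, b in params:
        a = phi(W @ a + b)   # affine transform, then elementwise nonlinearity
    return a

# Example with two hidden layers (sizes chosen arbitrarily for illustration)
params = [(0.1 * np.ones((3, 2)), np.zeros(3)),
          (0.1 * np.ones((2, 3)), np.zeros(2)),
          (0.1 * np.ones((1, 2)), np.zeros(1))]
y = forward(np.array([1.0, -1.0]), params)
```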
Need to determine the weights Wk and biases bk by minimizing a cost function.
Adjustment of the weights is based on the total error.
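For a single linear unit, adjusting the weights based on the total (batch) error can be sketched as one gradient-descent step on the summed squared error. The step size and variable names here are illustrative assumptions, not from the notes.

```python
import numpy as np

def batch_step(w, X, d, eta):
    """One weight adjustment based on the total error over all samples:
    E(w) = 0.5 * sum_n (d(n) - w^T x(n))^2; gradient descent on E."""
    e = d - X @ w        # error vector over the whole batch
    grad = -X.T @ e      # dE/dw
    return w - eta * grad
```

Repeating the step drives the total squared error down on a well-conditioned problem.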
1/27/2016
The error is a function of the accuracy of the model and of the estimate of the parameters/weights. This can be improved
Bias
V(w)
Least Mean Square Algorithm (Chapter 3)
The LMS method was developed by Widrow and Hoff (1960).
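The LMS (Widrow-Hoff) rule updates the weights after every sample rather than over the whole batch. A minimal sketch, with an assumed step size and function name:

```python
import numpy as np

def lms(X, d, eta=0.05, epochs=500):
    """LMS: per-sample update w <- w + eta * e(n) * x(n),
    with instantaneous error e(n) = d(n) - w^T x(n)."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, t in zip(X, d):
            e = t - w @ x        # error on the current sample only
            w = w + eta * e * x
    return w
```

On consistent, noiseless data the iterates approach the exact least-squares solution, provided eta is small enough for stability.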
2/3/2016
Multilayer Perceptrons
(Neural Networks with at least one hidden layer)
We build on basic perceptrons, LMS and other optimization methods
Basic features of neural networks
1. Each activation function is differentiable and usually nonlinear
2. Network
1/25/2016
Maximum a posteriori (MAP) Estimation
Find the most probable weight (parameter) vector for the
general model
d = F(w, x)
where the data x and the desired output d are known.
For the linear model
d = w^T x
and Gaussian distributions, the MAP estimate
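For the linear model with Gaussian noise and a zero-mean Gaussian prior on w, the MAP estimate has the familiar regularized least-squares closed form. A sketch of that form; the variance values and function name are illustrative assumptions:

```python
import numpy as np

def map_estimate(X, d, sigma2=1.0, sigma_w2=10.0):
    """MAP weight estimate for the linear model d = X w + Gaussian noise,
    with a zero-mean Gaussian prior on w. Closed form (ridge regression):
    w_MAP = (X^T X + lam I)^{-1} X^T d,  lam = sigma2 / sigma_w2."""
    lam = sigma2 / sigma_w2
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ d)
```

As the prior variance sigma_w2 grows (lam -> 0), the MAP estimate approaches the ordinary least-squares solution.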
ECE542-001
Homework #1
Spring 2016
1. Using the functional notation, f(x), as the activation function for all neurons in the neural network shown in Fig. 16, page 22, the output of the first hidden layer for an input vector x is u, given by
u = f(Wx + b)
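Numerically, u = f(Wx + b) is one matrix-vector product, a bias add, and an elementwise activation. The sizes and values below are illustrative only (3 hidden neurons, 2 inputs), not taken from Fig. 16, and tanh stands in for f:

```python
import numpy as np

f = np.tanh                        # example activation function
W = np.array([[0.5, -0.2],
              [0.1,  0.8],
              [-0.3, 0.4]])        # 3x2 weight matrix (3 hidden neurons)
b = np.array([0.0, 0.1, -0.1])     # one bias per hidden neuron
x = np.array([1.0, 2.0])           # input vector

u = f(W @ x + b)                   # one entry of u per hidden neuron
```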
ECE542
Homework #1
Spring 2016
Solutions
1. Using the functional notation, f(x), as the activation function for all neurons in the neural network shown in Fig. 16, page 22, the output of the first hidden layer for an input vector x is u, given by
u = f(Wx + b)
ECE542
Homework #2
Spring 2016
1. Problem 1.3 (modified) Design a two-input, one-output neural network with one hidden layer of two neurons that implements the XOR function: C1 = {(0,0),(1,1)}, C2 = {(1,0),(0,1)}. There is no need to run the iterative algorithm.
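One valid hand-designed 2-2-1 network of the kind the problem asks for can be checked numerically. The specific weights below are one common choice (hidden units computing OR and AND with step activations), not the unique answer:

```python
import numpy as np

def step(v):
    return (v >= 0).astype(float)

def xor_net(x):
    """Hand-designed 2-2-1 network: output 0 for C1 = {(0,0),(1,1)},
    output 1 for C2 = {(1,0),(0,1)}. Weights are one valid choice.
    Hidden: h1 = OR(x1,x2), h2 = AND(x1,x2); output = OR and not AND."""
    W1 = np.array([[1.0, 1.0],
                   [1.0, 1.0]])
    b1 = np.array([-0.5, -1.5])
    h = step(W1 @ x + b1)
    w2 = np.array([1.0, -2.0])
    b2 = -0.5
    return step(w2 @ h + b2)
```

Evaluating the four corner points confirms the two classes are separated.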
ECE542
Homework #3
Spring 2016
1. Problem 2.2
Note: eq.(2.28) can be used as a starting point also. Formulate the problem using
matrix-vector representation and solve the problem using vector derivatives.
2. Problem 2.8 (Modified)
a) Repeat the classification
ECE542
Summer 2012
Midterm
Closed Book, Closed Notes
Instructions: Write all your work on the pages provided. Please include all steps in your work. If an error is made, it is hard to determine what you meant to say. Leaving out steps may result in no partial credit.
ECE542
Homework #3
Spring 2014
Due 2/15/14
1. Problem 2.2
Note: eq.(2.28) can be used as a starting point also. You may also formulate the
problem using matrix-vector representation and solve the problem using vector
derivatives.
2. Problem 2.8 (Modified)
ECE542
Homework #3
Spring 2013
Solutions
1. Problem 2.2
Note: eq.(2.28) can be used as a starting point also. You may also formulate the
problem using matrix-vector representation and solve the problem using vector
derivatives.
2. Problem 2.8 (Modified)
a
ECE542
Homework #4
Spring 2014
Due 3/1/14
y = W(2) ( W(1) x + b(1) ) + b(2)
where W(2) is 1×2, W(1) is 2×2, x is 2×1, b(1) is 2×1, and b(2) is 1×1.
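With no nonlinearity between the layers, the composition above collapses to a single affine map: y = (W(2)W(1)) x + (W(2)b(1) + b(2)). A quick numerical check with randomly chosen example matrices of the stated sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
W2 = rng.standard_normal((1, 2))   # 1x2
W1 = rng.standard_normal((2, 2))   # 2x2
b1 = rng.standard_normal(2)        # 2x1
b2 = rng.standard_normal(1)        # 1x1
x = rng.standard_normal(2)         # 2x1

y_nested = W2 @ (W1 @ x + b1) + b2          # layer-by-layer evaluation
y_flat = (W2 @ W1) @ x + (W2 @ b1 + b2)     # single equivalent affine map
assert np.allclose(y_nested, y_flat)
```

This is why hidden layers need nonlinear activations: stacked linear layers add no representational power.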
1. (20) Problem 4.1-2 (modified)
For the architecture of Fig. 4.8, the weights and biases are given by [equations not reproduced in this copy], where the matrices and vectors are defined by eq. (4.72).
ECE542
Homework #4
Spring 2014
Solutions
y = W(2) ( W(1) x + b(1) ) + b(2)
where W(2) is 1×2, W(1) is 2×2, x is 2×1, b(1) is 2×1, and b(2) is 1×1.
1. (20) Problem 4.1-2 (modified)
For the architecture of Fig. 4.8, the weights and biases are given by [equations not reproduced in this copy], where the matrices and vectors are defined by eq. (4.72).
ECE542
Homework #2
Solutions
Spring 2014
1. Problem 1.3
It takes two lines to separate the classes
2. Problem 1.4 Modified
Let two one-dimensional classes, C1 and C2, have Gaussian distributions with
common variance equal to 1. Their means are μ1 = 6 and μ2 = 20.
ECE542
Homework #2
Spring 2014
Due Thursday 1/30/14
1. Problem 1.3
2. Problem 1.4 Modified
Let two one-dimensional classes, C1 and C2, have Gaussian distributions with
common variance equal to 1. Their means are μ1 = 6 and μ2 = 20.
(a) Design a perceptron classifier
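For equal-variance, equal-prior Gaussian classes, the minimum-error boundary sits at the midpoint of the means, here (6 + 20)/2 = 13; a single threshold unit (a perceptron with weight 1 and bias -13) implements it. A sketch of that design, with the class-index convention an assumption of this example:

```python
def classify(x, mu1=6.0, mu2=20.0):
    """Threshold classifier for two equal-variance 1-D Gaussian classes.
    Decision boundary at the midpoint (mu1 + mu2) / 2 = 13; equivalent to
    a perceptron with weight 1 and bias -13. Returns the class index."""
    threshold = 0.5 * (mu1 + mu2)
    return 1 if x < threshold else 2
```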
ECE542-001
Homework #1
Spring 2014
Due Thursday 1/16/14, 2:20pm
1. Using the functional notation, f(x), as the activation function for all neurons in the
neural network shown in Fig. 16, page 22, the output of the first hidden layer for
an input vector x
ECE542-001
Homework #1
Spring 2014
Solutions
1. Using the functional notation, f(x), as the activation function for all neurons in the
neural network shown in Fig. 16, page 22, the output of the first hidden layer for
an input vector x is u, given by
f (
ECE542
Homework #5
Spring 2014
Due Monday 3/24/14
1. Create static data sets to be used for later problems and examples. The sets consist of two classes.
a) Using the halfmoon data, create sets of 3000 samples (1500 in each class) with separation dist = -
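A half-moon data generator of the kind this assignment assumes can be sketched as below. The radius and width defaults are illustrative assumptions (the assignment's exact parameters are not reproduced here), and the separation `dist` is left as a parameter since its value is truncated above.

```python
import numpy as np

def halfmoon(n_per_class, dist, radius=10.0, width=6.0, seed=0):
    """Two-class half-moon data set. Each row: [x, y, label].
    Upper moon is class +1; the lower moon is mirrored, shifted right by
    `radius` and down by `dist`, and labeled -1. radius/width defaults
    are assumptions of this sketch."""
    rng = np.random.default_rng(seed)
    n = 2 * n_per_class
    r = radius + width * (rng.random(n) - 0.5)   # radial spread of the band
    theta = np.pi * rng.random(n)                # upper half-circle angles
    x, y = r * np.cos(theta), r * np.sin(theta)
    # class +1: upper moon as generated
    x1, y1 = x[:n_per_class], y[:n_per_class]
    # class -1: mirrored vertically, offset by (radius, -dist)
    x2 = x[n_per_class:] + radius
    y2 = -y[n_per_class:] - dist
    return np.vstack([np.column_stack([x1, y1, np.ones(n_per_class)]),
                      np.column_stack([x2, y2, -np.ones(n_per_class)])])
```

A negative `dist` makes the two moons overlap, which is the non-separable case later problems exercise.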