...of back-propagation on this same neural net (which is shown again on the next page and on the tear-off sheet) with the following parameters:
•x1 = x2 = 1
•y* = 1
•r = 1
•All weights are initially 1, except that wC = -0.5

Part B1 (5 points)
What is the output y of the neural net before back-propagation?

Part B2 (10 points)
Run one step of back-propagation (keeping in mind that the nodes are adders and not sigmoids). What are the new weights on the edges? For partial credit, you should show your work for each weight, unless that weight is unchanged.
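The computation Part B2 asks for can be sketched in code. Note that the excerpt does not include the network figure, so the topology below is an assumption: x1 feeds node A (weight w1A), x2 feeds node B (weight w2B), A and B feed node C (weights wAC, wBC), and each node also has a bias weight (wA, wB, wC) on a constant -1 input, a common convention in this style of problem. Because the nodes are adders (identity activation), every local derivative is 1 and the deltas are simple products.

```python
# Sketch: one step of back-propagation on a small all-adder network.
# ASSUMED topology (the figure is not in this excerpt): x1 -> A (w1A),
# x2 -> B (w2B), A -> C (wAC), B -> C (wBC); each node also has a bias
# weight (wA, wB, wC) attached to a constant -1 input.

def forward(w, x1, x2):
    """Node outputs for the assumed adder network (identity activations)."""
    a = w["w1A"] * x1 + w["wA"] * (-1)
    b = w["w2B"] * x2 + w["wB"] * (-1)
    y = w["wAC"] * a + w["wBC"] * b + w["wC"] * (-1)
    return a, b, y

def backprop_step(w, x1, x2, y_star, r):
    """One gradient-descent step on E = 1/2 (y* - y)^2, update w' = w + r*delta*input."""
    a, b, y = forward(w, x1, x2)
    delta_c = y_star - y           # output-node delta; adder => derivative 1
    delta_a = delta_c * w["wAC"]   # propagated back through C's old weights
    delta_b = delta_c * w["wBC"]
    new = dict(w)
    new["wAC"] += r * delta_c * a
    new["wBC"] += r * delta_c * b
    new["wC"]  += r * delta_c * (-1)
    new["w1A"] += r * delta_a * x1
    new["wA"]  += r * delta_a * (-1)
    new["w2B"] += r * delta_b * x2
    new["wB"]  += r * delta_b * (-1)
    return new

weights = {"w1A": 1, "w2B": 1, "wA": 1, "wB": 1,
           "wAC": 1, "wBC": 1, "wC": -0.5}
_, _, y_before = forward(weights, x1=1, x2=1)
print("output before back-propagation:", y_before)
print("weights after one step:", backprop_step(weights, 1, 1, y_star=1, r=1))
```

Since the topology is assumed, the printed numbers are illustrative of the procedure rather than the official answers; swap in the figure's actual wiring before trusting the values.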
wAC' = ___  wBC' = ___  wC' = ___  w1A' = ___  w2B' = ___  wA' = ___  wB' = ___

Part B3 (5 points)
What is the output y of the neural net after one step of back-propagation?

Part B4 (6 points)
Lyla trains her adder neural net on data in which all samples occupy a square, and the negative samples always occupy the lower-left quadrant of the square. The following is representative:

The neural net is supposed to output a positive value for the data points marked +, and a...
Artificial Intelligence, Fall '10