CS 6375 Machine Learning, Spring 2009, HW4
Gang LIU, SID: 11458407
Mar. 05, 2009
Email: [email protected]

Solution: We define the sum of squared errors over the training examples as

$$E = \frac{1}{2}\sum_{\text{examples}} \big(t - g(z)\big)^2, \qquad z = W \cdot x.$$

Computing the gradient $\nabla E$ component-wise:

$$\frac{\partial E}{\partial W_j} = -\big(t - g(z)\big)\, g'(z)\, \frac{\partial z}{\partial W_j} = -\big(t - g(z)\big)\, g'(z)\, x_j.$$

Since $g(z) = e^{-\beta z^2}$, we have $g'(z) = -2\beta z e^{-\beta z^2} = -2\beta z\, g(z)$, and therefore

$$\frac{\partial E}{\partial W_j} = 2\beta \big(t - g(z)\big)\, z\, g(z)\, x_j.$$

As for modifying the algorithm: actually, the algorithm needs no change; we only need to replace its weight-update line with

$$W_j \leftarrow W_j - 2\eta\beta \big(t - g(z)\big)\, z\, g(z)\, x_j, \qquad z = W \cdot x.$$

2. Neural network and back propagation [15 pts]

For the following network with the given initial weights, show the weights after the example ((1, 1), 1) is presented. Assume the learning rate is 0.05, with no momentum. The sigmoid function is used in the nodes.

Solution: Rather than doing the arithmetic by hand or simplifying the formulas into concise code, I combine the two: I write down all of the necessary formulas directly as code, without simplification, and simply show the computation process. Since MATLAB cannot express traditional mathematical notation, I mainly use the underscore "_" to denote a connection....
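The one-step back-propagation computation described above can be sketched in Python. This is a minimal sketch only: the network layout (a 2-2-1 feed-forward net) and every initial weight value below are hypothetical assumptions, since the assignment's figure is not reproduced in this preview. Only the learning rate 0.05, the example ((1, 1), 1), the sigmoid activation, and the absence of momentum come from the problem statement. The weight names follow the underscore convention mentioned above (e.g., `w_1_3` is the weight from node 1 to node 3).

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

eta = 0.05                   # learning rate (from the problem statement)
x1, x2, t = 1.0, 1.0, 1.0    # training example ((1, 1), 1)

# Hypothetical initial weights (NOT the figure's values); w_0_j is node j's bias.
w_1_3, w_2_3, w_0_3 = 0.1, 0.2, 0.1   # into hidden node 3
w_1_4, w_2_4, w_0_4 = 0.3, 0.1, 0.2   # into hidden node 4
w_3_5, w_4_5, w_0_5 = 0.2, 0.3, 0.1   # into output node 5

# Forward pass.
o3 = sigmoid(w_1_3 * x1 + w_2_3 * x2 + w_0_3)
o4 = sigmoid(w_1_4 * x1 + w_2_4 * x2 + w_0_4)
o5 = sigmoid(w_3_5 * o3 + w_4_5 * o4 + w_0_5)

# Backward pass: delta_k = o_k (1 - o_k)(t - o_k) at the output node;
# delta_h = o_h (1 - o_h) w_h_k delta_k at a hidden node (using the OLD weights).
d5 = o5 * (1 - o5) * (t - o5)
d3 = o3 * (1 - o3) * w_3_5 * d5
d4 = o4 * (1 - o4) * w_4_5 * d5

# Weight updates: w <- w + eta * delta * input (no momentum term).
w_3_5 += eta * d5 * o3
w_4_5 += eta * d5 * o4
w_0_5 += eta * d5
w_1_3 += eta * d3 * x1
w_2_3 += eta * d3 * x2
w_0_3 += eta * d3
w_1_4 += eta * d4 * x1
w_2_4 += eta * d4 * x2
w_0_4 += eta * d4
```

With the figure's actual weights substituted in, running this once prints the weights the problem asks for; note that the hidden-node deltas must be computed with the old output-layer weights before those weights are overwritten, which is why all deltas are computed before any update.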