HW4_ML_Gang_LIU - CS 6375 Machine Learning Spring 2009 HW4...


CS 6375 Machine Learning, Spring 2009, HW4
Gang LIU, SID: 11458407, Mar. 05, 2009, Email: [email protected]

Solution: We define the sum of squared errors as

    E = \frac{1}{2} \sum_{examples} (t - g(z))^2, \qquad z = \sum_j W_j x_j.

Taking the gradient with respect to a weight W_j,

    \frac{\partial E}{\partial W_j} = -\sum_{examples} (t - g(z)) \, g'(z) \, \frac{\partial z}{\partial W_j} = -\sum_{examples} (t - g(z)) \, g'(z) \, x_j.

For the unit's activation g(z) = e^{-\beta z^2} we have

    g'(z) = -2\beta z \, e^{-\beta z^2} = -2\beta z \, g(z),

    \therefore \frac{\partial E}{\partial W_j} = 2\beta \sum_{examples} (t - g(z)) \, z \, g(z) \, x_j.

To modify the gradient-descent algorithm above: actually, the algorithm needs no change; we only need to replace its last line (the weight update) with

    W_j \leftarrow W_j - 2\eta\beta \, (t - g(z)) \, z \, g(z) \, x_j.

2. Neural network and back propagation [15 pts]

For the following network with initial weights, show the weights after the example ((1,1), 1) is presented. Assume the learning rate is 0.05 and no momentum; the sigmoid function is used in the nodes.

Solution: Rather than doing the digit manipulation by hand or writing concise code, I combine the two: I write down all the necessary formulas in MATLAB without simplifying them, so the script simply shows the computation process. Since MATLAB cannot express traditional formula notation, I mainly resort to the underscore "_" to embody the idea of a connection....
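The problem-1 update rule can be sketched in code. This is a minimal Python sketch (standing in for the author's MATLAB), assuming the activation g(z) = e^{-beta z^2} from the derivation above; the function name and the sample weights, inputs, and beta value are illustrative choices, not from the original solution.

```python
import math

def g(z, beta):
    """Activation g(z) = e^{-beta * z^2}."""
    return math.exp(-beta * z * z)

def gaussian_unit_update(w, x, t, eta, beta):
    """One per-example gradient-descent step:
       W_j <- W_j - 2*eta*beta*(t - g(z)) * z * g(z) * x_j,
       where z = sum_j W_j x_j."""
    z = sum(wj * xj for wj, xj in zip(w, x))
    gz = g(z, beta)
    gprime = -2.0 * beta * z * gz           # g'(z) = -2*beta*z*g(z)
    delta = eta * (t - gz) * gprime         # -eta * dE/dW_j, per x_j
    return [wj + delta * xj for wj, xj in zip(w, x)]

# Illustrative values (assumed, not from the assignment):
w_new = gaussian_unit_update(w=[0.5, -0.3], x=[1.0, 1.0],
                             t=1.0, eta=0.05, beta=1.0)
```

Since both inputs are 1.0 here, each weight moves by the same small amount in the direction that reduces the squared error.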