
CS464 Introduction to Machine Learning
Fall 2009 Homework 3 – Neural Networks
Due Date: November 18, 2009

Q1) a) Design a two-input perceptron that implements the boolean function A ∧ ¬B.
b) Give the trace of the perceptron learning algorithm for this function. Assume that all possible input combinations are given as training examples, the learning rate is 0.2, and the initial values of the weights are 0.1.

Q2) Derive a gradient descent training rule for a single unit with output o, where [the definition of o is missing from this preview].

Q3) Consider a two-layer feedforward ANN with two inputs a and b, one hidden unit c, and one output unit d. This network has five weights (w_ca, w_cb, w_c0, w_dc, w_d0), where w_x0 represents the threshold weight for unit x. Initialize these weights to the values (.1, .1, .1, .1, .1), then give their values after each of the first two training iterations of the BACKPROPAGATION algorithm. Assume the learning rate is 0.3, the momentum is 0.9, incremental weight updates, and the following training examples:

a  b  d
1  0  1
0  1  [target missing from this preview]

Note: the subject of Q3 (the backpropagation algorithm) is not included in the midterm.

You should hand in:
- a hard copy (type-written) of your homework, in class on the due date;
- a soft copy (Word document or PDF) emailed to your assistant, Mucahid Kutlu (mucahid@cs.bilkent.edu.tr). Make sure that your homework contains your name and that the subject field of your email is Homework3.

DO IT YOURSELF. CHEATING WILL BE PUNISHED.
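As a way to check the hand trace asked for in Q1, the perceptron learning loop can be sketched in code. This is a hedged sketch, not the required answer: the step activation with outputs in {0, 1}, the bias-as-weight-on-constant-1 convention, and the example visiting order are all assumptions that your course notes may define differently.

```python
# Sketch: perceptron learning for A AND (NOT B), eta = 0.2, all weights 0.1.
# Assumptions: step activation with output in {0, 1}, a bias weight w0 on a
# constant input of 1, and examples visited in a fixed order each epoch.

def step(z):
    return 1 if z >= 0 else 0

# All four input combinations with targets for A AND (NOT B)
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 1), ((1, 1), 0)]

eta = 0.2
w = [0.1, 0.1, 0.1]  # [w0 (bias), wA, wB]

for epoch in range(10):          # a few passes suffice for this function
    converged = True
    for (a, b), t in examples:
        o = step(w[0] + w[1] * a + w[2] * b)
        if o != t:
            converged = False
            # Perceptron rule: w_i <- w_i + eta * (t - o) * x_i
            w[0] += eta * (t - o) * 1
            w[1] += eta * (t - o) * a
            w[2] += eta * (t - o) * b
    if converged:                # an epoch with no updates: done
        break

print(w)  # final weights; the hand trace should show the same updates
assert all(step(w[0] + w[1] * a + w[2] * b) == t for (a, b), t in examples)
```

Because A ∧ ¬B is linearly separable, the perceptron convergence theorem guarantees this loop terminates; under the conventions above it settles after a few epochs.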
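For Q3, the mechanics of one incremental backpropagation update with momentum can likewise be sketched. This assumes sigmoid units throughout, which the handout does not state explicitly; the weight names `ca`, `cb`, `c0`, `dc`, `d0` follow the question, and `train_step` is a helper invented for this sketch. Only the first training example (a=1, b=0, d=1) is used, because the second example's target is cut off in the preview; add the remaining examples from the handout and call `train_step` once per example to reproduce the two required iterations.

```python
import math

# Sketch: one incremental backprop update with momentum for Q3's 2-1-1 net
# (inputs a, b -> hidden unit c -> output unit d; sigmoid units ASSUMED).
# Weight names follow the handout: w_ca, w_cb, w_c0, w_dc, w_d0 (x0 = threshold).
# Only the first example (a=1, b=0, d=1) appears below; the second example's
# target is missing from the preview, so take it from the original handout.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

eta, alpha = 0.3, 0.9          # learning rate and momentum from the handout
w = {k: 0.1 for k in ("ca", "cb", "c0", "dc", "d0")}
dw_prev = {k: 0.0 for k in w}  # previous deltas, for the momentum term

def train_step(a, b, t):
    # Forward pass
    oc = sigmoid(w["ca"] * a + w["cb"] * b + w["c0"])
    od = sigmoid(w["dc"] * oc + w["d0"])
    # Backward pass: standard sigmoid-unit error terms
    delta_d = od * (1 - od) * (t - od)
    delta_c = oc * (1 - oc) * w["dc"] * delta_d
    grads = {"dc": delta_d * oc, "d0": delta_d,
             "ca": delta_c * a, "cb": delta_c * b, "c0": delta_c}
    for k in w:  # incremental update with momentum:
        dw = eta * grads[k] + alpha * dw_prev[k]  # dw = eta*grad + alpha*dw_prev
        w[k] += dw
        dw_prev[k] = dw

train_step(1, 0, 1)  # first training example from the handout
print(w)             # weights after one incremental update
```

Note that with b = 0 in the first example, w_cb receives a zero gradient and is unchanged after this step, which is a quick sanity check for the hand computation.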

This note was uploaded on 12/27/2009 for the course CS 464 taught by Professor Demir during the Fall '08 term at Bilkent University.
