CS446: Pattern Recognition and Machine Learning
Fall 2009
Problem Set 5
Handed Out: October 29, 2009
Due: November 5, 2009

• Feel free to talk to your classmates about the homework. I am more concerned that you learn how to solve the problem than that you demonstrate that you solved it entirely on your own. You should, however, write down your solution yourself. Please try to keep the solution brief and clear.

• Please, no handwritten solutions. Be sure your name appears at the top of each page.

• Please present your algorithms in both pseudocode and English. That is, give a precise formulation of your algorithm as pseudocode and also explain in one or two concise paragraphs what your algorithm does. Be aware that pseudocode is much simpler and more abstract than real code. Take a look at the textbook pseudocode (e.g. Table 2.5 on page 33) to get an idea of the appropriate level of abstraction.

• The homework is due at 4:00 pm on the due date. Email your writeup and your code to the TA. Please do NOT hand in a hard copy of your writeup. Please put "<userid> CS446 hw5 submission" as the subject line of the email when you submit your homework to [email protected]. Put all of your files into a single compressed file with the file name "<userid>hw5.tgz".

1. [Gradient Descent - 25 points]

We have a dataset with l examples. Each example can be represented as (x, y), where x is the feature vector and y ∈ {−1, 1}...
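The problem statement above is truncated, so the exact objective is not shown here; the assignment's topic tags mention the L2 (squared) hinge loss. As a hedged sketch only, the following shows plain gradient descent minimizing an assumed squared hinge loss L(w) = (1/l) Σᵢ max(0, 1 − yᵢ w·xᵢ)². The learning rate, step count, and toy data are illustrative assumptions, not part of the assignment.

```python
# Sketch: gradient descent on an assumed L2 (squared) hinge loss.
# The loss, step size, and stopping rule are assumptions for illustration,
# not the assignment's exact specification.
import numpy as np

def squared_hinge_loss(w, X, y):
    # L(w) = mean over examples of max(0, 1 - y_i * w.x_i)^2
    margins = np.maximum(0.0, 1.0 - y * (X @ w))
    return np.mean(margins ** 2)

def gradient(w, X, y):
    # d/dw of mean(max(0, 1 - y*w.x)^2) = mean(-2 * margin * y * x)
    margins = np.maximum(0.0, 1.0 - y * (X @ w))
    return (-2.0 * (margins * y)) @ X / len(y)

def gradient_descent(X, y, lr=0.1, steps=500):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        w -= lr * gradient(w, X, y)
    return w

# Toy linearly separable data with labels in {-1, +1}
X = np.array([[1.0, 2.0], [2.0, 1.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = gradient_descent(X, y)
```

In pseudocode terms, the loop is simply: repeat w ← w − η ∇L(w) for a fixed number of steps; a writeup would also state the derivation of ∇L(w) and the stopping criterion.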