Lecture 5: Support Vector Machines for Classification
Ryan Rifkin

Description

We introduce Regularized Least Squares (RLS) regression and classification, then derive SVMs from both a geometric perspective and the regularization perspective. Optimality and duality are introduced to demonstrate how large SVMs can be solved, and a comparison is made between SVMs and RLSC.
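The RLSC half of that comparison is compact enough to sketch directly: minimizing the regularized empirical risk (1/n)‖Xw − y‖² + λ‖w‖² gives the closed-form solution w = (XᵀX + nλI)⁻¹Xᵀy, and classification takes the sign of the linear function. The toy data, labels, and regularization value below are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

# Hypothetical toy data: two well-separated Gaussian clusters, labels in {-1, +1}.
rng = np.random.default_rng(0)
X_pos = rng.normal(loc=+2.0, scale=0.5, size=(20, 2))
X_neg = rng.normal(loc=-2.0, scale=0.5, size=(20, 2))
X = np.vstack([X_pos, X_neg])
y = np.concatenate([np.ones(20), -np.ones(20)])

lam = 0.1  # regularization strength (assumed value for this sketch)
n, d = X.shape

# RLSC in closed form: solve the regularized normal equations
# (X^T X + n*lam*I) w = X^T y  for the weight vector w.
w = np.linalg.solve(X.T @ X + n * lam * np.eye(d), X.T @ y)

# Classify by the sign of the linear function <w, x>.
preds = np.sign(X @ w)
accuracy = np.mean(preds == y)
```

Unlike the SVM, which requires a quadratic program, this classifier needs only one linear solve; the lecture's comparison turns on whether that simplicity costs anything in accuracy or sparsity.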

Suggested Reading

Rifkin. Everything Old Is New Again: A Fresh Look at Historical Approaches in Machine Learning. MIT Ph.D. Thesis, 2002.
Evgeniou, Pontil and Poggio. Regularization Networks and Support Vector Machines. Advances in Computational Mathematics, 2000.
V. N. Vapnik. The Nature of Statistical Learning Theory. Springer, 1995.

This note was uploaded on 11/11/2011 for the course BIO 9.07, taught by Professor Ruth Rosenholtz during the Spring '04 term at MIT.
