CSCI 1950-F: Introduction to Machine Learning
Erik Sudderth and Mark Johnson, Fall 2009

How can artificial systems learn from examples, and discover information buried in massive datasets? This course explores the theory and practice of statistical machine learning. Topics include parameter estimation, probabilistic graphical models, approximate inference, and kernel and nonparametric methods. Applications to regression, categorization, and clustering problems are illustrated by examples from vision, language, communications, and bioinformatics. Prerequisites: CSCI0160, CSCI0180 or CSCI0190, and comfort with basic probability, linear algebra, and calculus.

Introduction

The main goal of this class is to introduce you to the ideas and techniques of machine learning, and the probabilistic models that underlie them. These ideas have their origins in work by statisticians such as Laplace and Bayes several centuries ago. However, modern computing techniques now permit us to apply them to problems of a size and diversity that were barely conceivable only a few decades ago.

The kinds of problems we'll discuss involve prediction of one kind or another. Classification problems involve predicting a discrete value from a finite set of choices, while regression problems involve predicting a continuous value. Supervised learning techniques can be used to design such...
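The classification/regression distinction above can be made concrete with a small sketch. This example is not from the course materials: it uses toy data, a 1-nearest-neighbor rule for the discrete-label case, and a least-squares line fit for the continuous case, purely as illustrative choices.

```python
import numpy as np

# --- Classification: predict a discrete label from a finite set ---
# Toy 1-D training data: points near 0 are class 0, points near 5 are class 1.
X_train = np.array([0.0, 1.0, 4.0, 5.0])
y_train = np.array([0, 0, 1, 1])

def classify(x):
    """1-nearest-neighbor: return the label of the closest training point."""
    dists = np.abs(X_train - x)
    return int(y_train[np.argmin(dists)])

# --- Regression: predict a continuous value ---
# Toy data on the noiseless line y = 2x + 1; fit slope and intercept
# by least squares.
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = 2.0 * xs + 1.0
A = np.column_stack([xs, np.ones_like(xs)])
slope, intercept = np.linalg.lstsq(A, ys, rcond=None)[0]

def predict(x):
    """Return the fitted continuous prediction at x."""
    return slope * x + intercept
```

A call like `classify(0.5)` returns one of finitely many labels, while `predict(10.0)` can return any real number; this is exactly the discrete-versus-continuous output distinction in the paragraph above.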
This note was uploaded on 11/03/2009 for the course CS 195f taught by Professor Johnson during the Spring '09 term at Sanford-Brown Institute.