ADVANCED MACHINE LEARNING
COMS 4772/6772 HOMEWORK #1
PROF. TONY JEBARA
DUE FEBRUARY 25th, 2015, BEFORE CLASS
Write all code in Matlab. Submit your work via Courseworks; if you are unable to, contact the TA via email. Make
sure to tar.gz
predict what chord is being played; build a JTA (junction tree) for fast parsing, since there are too many combinations of samples and
classes for multiclass classification; structured prediction with music (100 ms frames). A parse tree of the song
would be a bonus improvement.
ADVANCED MACHINE LEARNING
CLASS PROJECT
PROF. TONY JEBARA
PRESENTATIONS ON
APR 22, APR 27, APR 29 AND MAY 04 2015
WRITE-UP DUE ON
MAY 7th 2015 BY MIDNIGHT
1. These are either individual (1-person) projects or 2-person projects. We expect double the amount o
Top-down Image Segmentation with Shape Priors
Ilya Kavalerov
[email protected]
Abstract
Given a group of images depicting the same object type, we aim to learn segmentation masks
for objects of that type. In particular, we evaluate the efficacy of a p
Top-Down Image Segmentation with Shape Priors
Ilya Kavalerov
Advanced Machine Learning w4772
Columbia University
[email protected]
May 1, 2015
The Problem
Cleanly segment an object known to be in t
TOPIC 7. CLUSTERING
7.6 Semantics of k-means clustering
Suppose the input data set $S \subset \mathbb{R}^d$ is partitioned into $k$ subsets $S_1, S_2, \ldots, S_k$; associate each $S_i$
with a representative $\mu_i \in \mathbb{R}^d$. When is a near-optimal solution to k-means clustering on $S$ close
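To make the objects in this setup concrete, here is a minimal sketch (in Python rather than the Matlab required for homework; all names are illustrative) of the k-means cost for a fixed partition, together with the standard fact that the cost-minimizing representative of each subset $S_i$ is its mean:

```python
import numpy as np

def kmeans_cost(S, labels, mu):
    """Sum of squared distances from each point to its cluster's representative."""
    return sum(np.sum((S[labels == i] - mu[i]) ** 2) for i in range(len(mu)))

# Toy data: two well-separated groups in R^2.
rng = np.random.default_rng(0)
S = np.vstack([rng.normal(0.0, 0.1, (50, 2)), rng.normal(5.0, 0.1, (50, 2))])
labels = np.array([0] * 50 + [1] * 50)

# For a fixed partition, the best representative mu_i of S_i is its mean.
mu = np.array([S[labels == i].mean(axis=0) for i in range(2)])

# Any other choice of representatives can only increase the cost.
assert kmeans_cost(S, labels, mu) <= kmeans_cost(S, labels, np.zeros((2, 2)))
```

Lloyd's algorithm alternates exactly these two steps: fix the representatives and reassign points, then fix the partition and recompute the means.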
Topic 8: Metric embeddings
8.1 Embedding $\ell_2^d$ into $\ell_1^{O(d)}$
Theorem 8.1. Pick any unit vector $u \in \mathbb{R}^d$. If $A$ is a random $k \times d$ matrix whose $(i,j)$-th entry is
$Z_{i,j}/(k\sqrt{2/\pi})$, where $\{Z_{i,j}\}_{i \in [k],\, j \in [d]}$ are i.i.d. $N(0,1)$ random variables, then for any $\varepsilon \in (0,1)$,
$\Pr\left[\, \big|\, \|Au\|_1 - 1 \,\big| \geq \varepsilon \,\right] \leq 2$
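A quick numerical sanity check of the theorem (a sketch, assuming the scaling $Z_{i,j}/(k\sqrt{2/\pi})$ above; the dimensions and seed are arbitrary): each coordinate of $Au$ is $N(0,1)/(k\sqrt{2/\pi})$, and since $\mathbb{E}|N(0,1)| = \sqrt{2/\pi}$, the $\ell_1$ norm $\|Au\|_1$ has expectation $\|u\|_2 = 1$ and concentrates around it for large $k$:

```python
import numpy as np

rng = np.random.default_rng(1)
d, k = 20, 20000                 # large k so that ||Au||_1 concentrates

u = rng.normal(size=d)
u /= np.linalg.norm(u)           # unit vector in R^d

# Random k x d matrix with (i,j)-th entry Z_ij / (k * sqrt(2/pi)), Z_ij iid N(0,1).
A = rng.normal(size=(k, d)) / (k * np.sqrt(2.0 / np.pi))

# ||Au||_1 is an average of k iid copies of |N(0,1)| / sqrt(2/pi),
# so it should be close to ||u||_2 = 1.
deviation = abs(np.linalg.norm(A @ u, 1) - 1.0)
```

The deviation shrinks like $O(1/\sqrt{k})$, which is the concentration the theorem quantifies.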
Topic 5: Principal component analysis
5.1 Covariance matrices
Suppose we are interested in a population whose members are represented by vectors in $\mathbb{R}^d$. We
model the population as a probability distribution $P$ over $\mathbb{R}^d$, and let $X$ be a random vector with
distribution $P$.
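To make this setup concrete, a short sketch (hypothetical data; numpy only): draw samples from a distribution over $\mathbb{R}^3$ whose variance is largest along the first axis, estimate the covariance matrix of $X$ empirically, and read off the first principal direction as the eigenvector of the largest eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical population: Gaussian in R^3, most elongated along the first axis.
X = rng.normal(size=(5000, 3)) * np.array([3.0, 1.0, 0.5])

# Empirical covariance E[(X - EX)(X - EX)^T], estimated from the samples.
mean = X.mean(axis=0)
cov = (X - mean).T @ (X - mean) / (len(X) - 1)

# First principal component: eigenvector of the largest eigenvalue.
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
v1 = eigvecs[:, -1]                      # should align with the first axis
```

Here `np.linalg.eigh` is appropriate because a covariance matrix is symmetric positive semidefinite; its eigenvalues are the variances along the principal directions.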