1. (35pts) Pre-midterm
a. (4pts) For a neural network, which of the following choices most strongly affects the
trade-off between under-fitting and over-fitting:
i. The initial weights
ii. The learning rate
iii. The number of hidden nodes
iv. The choice of the
CS534 Machine Learning - Spring 2013
Final Exam
Name:
You have 110 minutes.
There are 6 questions (8 pages including cover page). If you get stuck on one question, move on to
others and come back to the difficult one later.
If the question asks for explan
CS273a Final Exam
Introduction to Machine Learning: Fall 2013
Thursday December 12th, 2013
Your name:
Your UCInetID (all caps):
Your Seat (row and number):
Total time is 1:50. READ THE EXAM FIRST and organize your time; don't spend too long
on any
CS178 Final Exam
Machine Learning & Data Mining: Winter 2011
Tuesday March 15th, 2011
Open book, open notes: total time is 1h 50min.
Your name:
Name of the person in front of you (if any):
Name of the person to your right (if any):
READ THE EXAM FIRST
CS534 Machine Learning, Spring 2009
Final Exam
Name:
You have 110 minutes.
There are 11 pages including cover page. Please make sure you are not missing any pages.
Good luck!
Question            Max
Pre-midterm         35
Ensemble methods    16
GMM
Learning theory
HAC
k-means
total
Final Exam
COSC 6342 Machine Learning
Solution Sketches
May 8, 2013
Your Name:
Your Student id:
Problem 1 [4]: Information Gain
Problem 2 [14]: Ensemble Methods + Other
Problem 3 [14]: Reinforcement Learning
Problem 4 [11]: Computations in Belief Networks
CS 178 Intro to Machine Learning
Winter 2010
Final Exam
Instructions:
(1) Write the names of your immediate-front and immediate-right neighbors (if any):
Person in front:
Person to the right:
(2) READ THE EXAM FIRST and organize your time.
(3) Write your n
Graphical Models
Markov Chains
Time Series
Prof. Alexander Ihler
Graphical models
Complex system made up of many simpler interactions:
Model
Factors are sets of variables (function arguments)
Factors may be conditional
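The factored-model idea above can be sketched in a few lines (a minimal illustration with a hypothetical three-variable chain, not code from the slides): each factor is a function of a subset of the variables, and the joint is the normalized product of the factors.

```python
# Hypothetical binary chain A - B - C; each factor maps an assignment
# of its argument variables to a nonnegative value.
f_ab = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.6}
f_bc = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8}

def joint(a, b, c):
    """Unnormalized joint: product of the factors touching each variable."""
    return f_ab[(a, b)] * f_bc[(b, c)]

# Normalize by brute-force summation (fine for a tiny model like this).
Z = sum(joint(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
p = {(a, b, c): joint(a, b, c) / Z
     for a in (0, 1) for b in (0, 1) for c in (0, 1)}
```

The point of the factorization is that each factor only touches its own arguments, so large joints can be built from many small local functions.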
Machine Learning and Data Mining
Collaborative Filtering & Recommender Systems
Prof. Alexander Ihler
Recommender systems
Automated recommendations
Inputs
User information
Situation context, demographics, preferences, past ratings
Items
Item charac
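A common baseline for the automated recommendations described above predicts an unseen rating from past ratings alone: global mean plus user and item offsets. This is a minimal sketch on hypothetical toy data, not the lecture's method.

```python
ratings = {  # hypothetical (user, item) -> rating data
    ("ann", "m1"): 5, ("ann", "m2"): 3,
    ("bob", "m1"): 4, ("bob", "m3"): 2,
    ("cat", "m2"): 4, ("cat", "m3"): 1,
}

mu = sum(ratings.values()) / len(ratings)  # global mean rating

def offset(pos, key):
    """Average deviation from the mean for one user (pos=0) or item (pos=1)."""
    devs = [r - mu for k, r in ratings.items() if k[pos] == key]
    return sum(devs) / len(devs) if devs else 0.0

def predict(user, item):
    """Baseline prediction: global mean + user offset + item offset."""
    return mu + offset(0, user) + offset(1, item)
```

For example, `predict("ann", "m3")` combines ann's tendency to rate above average with m3's tendency to be rated below average.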
Machine Learning and Data Mining
Bayes Classifiers
Prof. Alexander Ihler
A basic classifier
Training data D = {x(i), y(i)}, classifier f(x ; D)
Discrete feature vector x
f(x ; D) is a contingency table
Ex: credit rating prediction
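The table-based classifier over discrete features can be sketched as follows (hypothetical toy data; a minimal sketch, not the lecture's code): count how often each label occurs for each feature value, and predict the most frequent one.

```python
from collections import Counter, defaultdict

# Hypothetical discrete training pairs (feature tuple, label).
D = [((1, 0), 1), ((1, 0), 1), ((0, 0), 0),
     ((0, 1), 0), ((1, 1), 1), ((0, 0), 0)]

# Build the table: for each feature value x, count the observed labels.
table = defaultdict(Counter)
for x, y in D:
    table[x][y] += 1

def f(x):
    """Predict the most frequent class seen for this feature value."""
    if x not in table:
        return None  # unseen x: a plain table has no answer
    return table[x].most_common(1)[0][0]
```

The weakness this exposes, and that a Bayes classifier's probabilistic treatment addresses, is the `None` branch: a raw table says nothing about feature values never seen in D.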
Machine Learning and Data Mining
Decision Trees
Prof. Alexander Ihler
Decision trees
Split input into cases
Usually based on a single variable
Recurse down until we reach a decision
Continuous vars: choose split point
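The steps above can be sketched as a short recursion (assumed details such as the error criterion and toy data are mine, not the lecture's): split on a single variable at a chosen point, recurse until the labels are pure, then emit a decision.

```python
def majority(labels):
    """Most common label (ties broken arbitrarily)."""
    return max(set(labels), key=labels.count)

def build(X, y, depth=0, max_depth=3):
    """Grow a tree over feature tuples X and labels y."""
    if len(set(y)) == 1 or depth == max_depth:
        return ("leaf", majority(y))
    best = None  # (error, variable index, split point)
    for j in range(len(X[0])):               # one variable per split
        for t in sorted({x[j] for x in X}):  # candidate split points
            yes = [yi for xi, yi in zip(X, y) if xi[j] > t]
            no = [yi for xi, yi in zip(X, y) if xi[j] <= t]
            if not yes or not no:
                continue
            err = (sum(yi != majority(yes) for yi in yes)
                   + sum(yi != majority(no) for yi in no))
            if best is None or err < best[0]:
                best = (err, j, t)
    if best is None:
        return ("leaf", majority(y))
    _, j, t = best
    yes = [(xi, yi) for xi, yi in zip(X, y) if xi[j] > t]
    no = [(xi, yi) for xi, yi in zip(X, y) if xi[j] <= t]
    return ("split", j, t,
            build([x for x, _ in yes], [v for _, v in yes], depth + 1, max_depth),
            build([x for x, _ in no], [v for _, v in no], depth + 1, max_depth))

def predict(tree, x):
    """Recurse down until we reach a decision."""
    if tree[0] == "leaf":
        return tree[1]
    _, j, t, yes, no = tree
    return predict(yes if x[j] > t else no, x)

# Toy data mirroring the slide's split "X1 > .5": class 1 when X1 is large.
X = [(0.2, 0.9), (0.8, 0.1), (0.7, 0.8), (0.1, 0.2)]
y = [0, 1, 1, 0]
tree = build(X, y)
```

On this toy set a single split on X1 separates the classes, so the root is a split node with two pure leaves.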
[Figure: training points plotted in the (X1, X2) feature plane; the first split, X1 > .5, shown as a vertical line.]