Slide 1: CMPSCI 383, Nov 15, 2011. Regression and Classification with Linear Models.

Slide 2: Today's topics
• Learning from Examples: brief review
• Univariate Linear Regression
• Batch gradient descent
• Stochastic gradient descent (see the sketch at the end of these notes)
• Multivariate Linear Regression
• Regularization
• Linear Classifiers
• Perceptron learning rule
• Logistic Regression

Slides 3-8: Learning from Examples (supervised learning). [Figure-only slides; no text beyond the title survives in this preview.]

Slide 9: Important issues
• Generalization
• Overfitting
• Cross-validation
  • Holdout cross-validation
  • K-fold cross-validation
  • Leave-one-out cross-validation
• Model selection

Slide 10: Recall notation
Training set: (x_1, y_1), (x_2, y_2), ..., (x_N, y_N), where each y_j was generated by an unknown function y = f(x).
Goal: discover a function h (the hypothesis) that best approximates the true function f.

Slide 11: Loss functions
L(x, y, y_hat) = Utility(result of using y given input x) - Utility(result of using y_hat given input x)
Suppose the true prediction for input ... [preview truncated here]
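As a companion to the gradient-descent topics listed on slide 2, here is a minimal Python sketch (not taken from the slides) of univariate linear regression, h_w(x) = w0 + w1*x, fit by both batch and stochastic gradient descent on squared loss. The learning rate alpha, the epoch counts, and the toy training set are illustrative assumptions, not values from the lecture.

# Minimal sketch (illustrative, not from the slides): univariate linear
# regression h_w(x) = w0 + w1*x fit by batch and stochastic gradient
# descent on squared loss. alpha, epoch counts, and the toy data are
# assumed values chosen for the example.

import random

def predict(w0, w1, x):
    """Hypothesis h_w(x) = w0 + w1 * x."""
    return w0 + w1 * x

def batch_gradient_descent(data, alpha=0.01, epochs=1000):
    """One weight update per pass, using the gradient of the mean
    squared loss over the whole training set."""
    w0, w1 = 0.0, 0.0
    n = len(data)
    for _ in range(epochs):
        # Gradients of (1/n) * sum_j (y_j - h_w(x_j))^2 w.r.t. w0 and w1
        g0 = sum(-2 * (y - predict(w0, w1, x)) for x, y in data) / n
        g1 = sum(-2 * (y - predict(w0, w1, x)) * x for x, y in data) / n
        w0 -= alpha * g0
        w1 -= alpha * g1
    return w0, w1

def stochastic_gradient_descent(data, alpha=0.01, epochs=100):
    """One weight update per training example, visiting examples in
    random order each epoch."""
    w0, w1 = 0.0, 0.0
    for _ in range(epochs):
        random.shuffle(data)
        for x, y in data:
            err = y - predict(w0, w1, x)
            w0 += alpha * 2 * err
            w1 += alpha * 2 * err * x
    return w0, w1

if __name__ == "__main__":
    # Toy training set roughly following y = 3x + 1 with a little noise.
    train = [(0.0, 1.1), (1.0, 3.9), (2.0, 7.2), (3.0, 9.8), (4.0, 13.1)]
    print("batch:     ", batch_gradient_descent(list(train)))
    print("stochastic:", stochastic_gradient_descent(list(train)))

Both routines should recover weights near (1, 3) on this data; the stochastic version trades the exact batch gradient for cheaper, noisier per-example updates.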
Course: COMPSCI 383 (Fall 2011), UMass Amherst. Instructor: Professor Andrew Barto.
