Support Vector Regression

Interesting history:
– Generalized Portrait algorithm (Vapnik and Lerner, 1963; Vapnik and Chervonenkis, 1963, 1974). Developed in Russia; studied the properties of learning machines that enable them to generalize well to unseen data.
– Support Vector Machine. Bell Labs, Vapnik and coworkers (1992 on). Applied to optical character recognition and object recognition.
– Applied to regression since 1997.
The first part of the lecture is based on Smola and Schölkopf (2004); see the reference material.

Basic idea
• We do not care about errors below ε, but we do care about the magnitude of the polynomial coefficients.
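The formula behind this idea did not survive extraction; in the notation of Smola and Schölkopf (2004), it corresponds to the ε-insensitive loss

    |\xi|_\varepsilon := \max\{0,\ |\xi| - \varepsilon\}

which is zero for residuals inside a tube of half-width ε and grows linearly outside it.
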
Linear approximation
• If the noise limit ε is known and the function is exactly linear, we can define the optimization problem below.
• The objective function reflects “flatness.”
• The problem may not be feasible.
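The problem statement was an image in the original; its standard form, as given in Smola and Schölkopf (2004), is

    \min_{w,b}\ \tfrac{1}{2}\|w\|^2
    \text{subject to}\quad y_i - \langle w, x_i \rangle - b \le \varepsilon,
    \qquad \langle w, x_i \rangle + b - y_i \le \varepsilon

for f(x) = \langle w, x \rangle + b. A small \|w\| means a flat (small-slope) function; the problem is infeasible when no linear function fits every point within ε.
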
“Soft” formulation
• Primal problem (reconstructed below).
• The second term in the objective corresponds to the ε-insensitive loss function.
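The equations were lost in extraction; the usual soft ε-SVR primal, following Smola and Schölkopf (2004), introduces slack variables ξ_i, ξ_i^* for points outside the tube:

    \min_{w,b,\xi,\xi^*}\ \tfrac{1}{2}\|w\|^2 + C\sum_{i=1}^{\ell}(\xi_i + \xi_i^*)
    \text{subject to}\quad y_i - \langle w, x_i \rangle - b \le \varepsilon + \xi_i,
    \qquad \langle w, x_i \rangle + b - y_i \le \varepsilon + \xi_i^*,
    \qquad \xi_i, \xi_i^* \ge 0.

As a minimal sketch (not part of the slides), scikit-learn's SVR implements this ε-insensitive formulation; the data below is synthetic, for illustration only:

    # epsilon-SVR on synthetic noisy linear data (illustrative only)
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(100, 1))
    y = 0.8 * X.ravel() + rng.normal(scale=0.1, size=100)

    # C weights the slack penalty; epsilon is the tube half-width
    model = SVR(kernel="linear", C=1.0, epsilon=0.1)
    model.fit(X, y)
    print(model.coef_, model.intercept_)  # recovered w and b
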
Optimality criteria
• Lagrangian function (reconstructed below).
• Karush–Kuhn–Tucker conditions.
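The expressions were images in the original; following Smola and Schölkopf (2004), the Lagrangian of the soft primal is

    L := \tfrac{1}{2}\|w\|^2 + C\sum_i(\xi_i + \xi_i^*) - \sum_i(\eta_i \xi_i + \eta_i^* \xi_i^*)
    \quad - \sum_i \alpha_i(\varepsilon + \xi_i - y_i + \langle w, x_i \rangle + b)
    \quad - \sum_i \alpha_i^*(\varepsilon + \xi_i^* + y_i - \langle w, x_i \rangle - b)

with multipliers \alpha_i, \alpha_i^*, \eta_i, \eta_i^* \ge 0. Setting the partial derivatives to zero gives

    \partial_b L = \sum_i(\alpha_i^* - \alpha_i) = 0
    \partial_w L = w - \sum_i(\alpha_i - \alpha_i^*)\, x_i = 0
    \partial_{\xi_i^{(*)}} L = C - \alpha_i^{(*)} - \eta_i^{(*)} = 0

and the Karush–Kuhn–Tucker conditions require each multiplier times its constraint to vanish at the solution, e.g. \alpha_i(\varepsilon + \xi_i - y_i + \langle w, x_i \rangle + b) = 0 and (C - \alpha_i)\xi_i = 0.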