CS295: Convex Optimization
Xiaohui Xie
Department of Computer Science
University of California, Irvine
Course information
Prerequisites: multivariate calculus and linear algebra
Textbook: Convex Optimization
Stochastic Subgradient Method
Lingjie Weng, Yutian Chen
Bren School of Information and Computer Science
UC Irvine
Subgradient
Recall the basic inequality for convex differentiable f:
f(y) ≥ f(x) + ∇f(x)ᵀ(y − x)
The gradient thus gives a global affine underestimator of f.
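Where f is not differentiable, a subgradient g replaces ∇f(x) in the inequality above, and the subgradient method simply steps along −g with diminishing step sizes. A minimal sketch (my own example, not from the slides) on f(x) = |x − 3|, which is convex but not differentiable at its minimizer:

```python
def subgradient(x):
    """A subgradient of f(x) = |x - 3|."""
    if x > 3:
        return 1.0
    if x < 3:
        return -1.0
    return 0.0  # at the kink, any value in [-1, 1] is a valid subgradient

x = 0.0
for k in range(1, 10001):
    alpha = 1.0 / k              # diminishing steps: sum diverges, terms -> 0
    x = x - alpha * subgradient(x)

print(x)  # hovers near the minimizer x* = 3
```

With the 1/k step rule the iterates oscillate around x* = 3 with an error on the order of the current step size, the classic behavior of (stochastic) subgradient methods.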
Convex Optimization for
Multitask Feature Learning
Priya Venkateshan
MULTITASK FEATURE LEARNING
MULTITASK FEATURE LEARNING VIA
EFFICIENT L2,1 NORM MINIMIZATION
A probabilistic framework for MTFL
k tasks
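The L2,1 norm at the heart of this formulation is easy to compute directly. A minimal sketch (assuming the row-wise convention, which varies across papers): summing the Euclidean norms of the rows of the weight matrix W encourages entire rows, i.e. features shared across tasks, to be exactly zero.

```python
import numpy as np

# ||W||_{2,1} = sum_i ||W_i||_2  (sum of row-wise Euclidean norms),
# a group-sparsity penalty: zeroing a whole row removes that feature
# from every task at once.
W = np.array([[3.0, 4.0],
              [0.0, 0.0],   # this feature is dropped for all tasks
              [1.0, 0.0]])

l21 = np.sum(np.linalg.norm(W, axis=1))
print(l21)  # row norms 5, 0, 1 -> 6.0
```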
Color Constancy
Michael Bannister and Jenny Lam
March 3, 2011
The Color Constancy problem
Before
After
Solving the problem
1. find the color of the illuminant
2. correct the image
Correcting the image
I
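The two steps above can be sketched numerically. A minimal example (gray-world assumption, not necessarily the estimator used in the talk): take the mean image color as the illuminant estimate, then correct each channel with a diagonal (von Kries) scaling.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.uniform(0.0, 1.0, size=(4, 4, 3))   # toy RGB image
image[..., 2] *= 0.5                            # simulate a color cast (dim blue)

# step 1: find the color of the illuminant (gray-world: the mean color)
illuminant = image.reshape(-1, 3).mean(axis=0)

# step 2: correct the image with a per-channel diagonal scaling
corrected = image / illuminant * illuminant.mean()

# after correction the channel means coincide (the average is "gray")
print(corrected.reshape(-1, 3).mean(axis=0))
```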
Assignment 1
Vinu K Sebastian
Student ID: 11043792
January 28, 2017
1
Part 1
The task required us to improve the supervised text classifier. Several approaches were available
to do the
Last Modified: March 5, 2018
CS 295: Statistical NLP: Winter 2018
Homework 4: Neural Machine Translation
Sameer Singh (and Robert L. Logan)
http://sameersingh.org/courses/statnlp/wi18/
One of the most
About Me
Academic Positions
Recent Assistant Professor at UC Irvine! (2016 -)
Postdoc at University of Washington (2013 -)
PhD from University of Massachusetts, Amherst (2014)
Introduction to the C
Load balancing on a
heterogeneous cluster.
David Carrillo
Daniel Miller
March 8, 2011
Dynamic Load-Balancing on
Heterogeneous Clusters.
An extension of the PhD dissertation work of
John Duselis, under
Beamforming Optimization of MIMO Interference
Network
Feng Jiang
Department of Electrical Engineering and Computer Science
University of California at Irvine
Irvine, CA 92617
[email protected]
Ma
Optimization problems
Optimization problems in standard form
Convex problems in standard form
Some special problems
Optimization problems in standard form
minimize f0(x)
subject to fi(x) ≤ 0, i = 1, . . . , m
Optimality conditions
Optimization problems in standard form
minimize f0(x)
subject to fi(x) ≤ 0, i = 1, . . . , m
hi(x) = 0, i = 1, . . . , p
x = (x1, . . . , xn) ∈ ℝⁿ: optimization variables
f0 : ℝⁿ → ℝ: objective (or cost) function
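A minimal numerical sketch (my example, not from the slides) of a small convex problem in this standard form, solved with SciPy's SLSQP solver, assuming SciPy is available:

```python
import numpy as np
from scipy.optimize import minimize

# A toy convex problem in standard form:
#   minimize   (x1 - 1)^2 + (x2 - 2)^2      f0
#   subject to x1 + x2 - 1 <= 0             f1
#              x1 - x2 = 0                  h1

def f0(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

constraints = [
    # SciPy writes inequalities as g(x) >= 0, so f1(x) <= 0 becomes 1 - x1 - x2 >= 0
    {"type": "ineq", "fun": lambda x: 1.0 - x[0] - x[1]},
    {"type": "eq", "fun": lambda x: x[0] - x[1]},
]

res = minimize(f0, x0=np.zeros(2), constraints=constraints, method="SLSQP")
print(res.x)  # the inequality is active at the optimum x = (0.5, 0.5)
```

The KKT conditions confirm the answer: at (0.5, 0.5), −∇f0 = (1, 3) = 2·(1, 1) − 1·(1, −1), with multiplier λ = 2 ≥ 0 on the active inequality.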
Duality
Lagrange dual problem
weak and strong duality
optimality conditions
perturbation and sensitivity analysis
generalized inequalities
Lagrangian
Consider the optimization problem in standard form
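The Lagrangian, the dual function, and weak versus strong duality can all be seen on a one-variable problem. A minimal worked sketch (my example, not from the slides):

```python
# Problem:  minimize x^2  subject to 1 - x <= 0,  optimal value p* = 1.
# Lagrangian:     L(x, lam) = x^2 + lam * (1 - x)
# Dual function:  g(lam) = min_x L(x, lam) = lam - lam^2 / 4
#                 (set dL/dx = 2x - lam = 0, so the minimizer is x = lam/2)

p_star = 1.0

def g(lam):
    return lam - lam ** 2 / 4.0

# Weak duality: g(lam) <= p* for every lam >= 0.
assert all(g(0.1 * k) <= p_star + 1e-12 for k in range(100))

# Strong duality holds here: the best lower bound is tight, d* = g(2) = 1 = p*.
print(g(2.0))
```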
Convex functions
Definition
f : ℝⁿ → ℝ is convex if dom f is a convex set and
f(θx + (1 − θ)y) ≤ θf(x) + (1 − θ)f(y)
for all x, y ∈ dom f and θ ∈ [0, 1].
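A quick numerical spot-check of this defining inequality (my sketch, using f(x) = exp(x), which is convex on all of ℝ):

```python
import math
import random

# Sample random pairs (x, y) and mixing weights t in [0, 1], and verify
# f(t*x + (1-t)*y) <= t*f(x) + (1-t)*f(y) for the convex function f = exp.
f = math.exp
random.seed(0)
for _ in range(1000):
    x, y = random.uniform(-5, 5), random.uniform(-5, 5)
    t = random.random()
    assert f(t * x + (1 - t) * y) <= t * f(x) + (1 - t) * f(y) + 1e-9

print("inequality held on all sampled points")
```

Sampling of course proves nothing, but it is a handy sanity check when experimenting with candidate convex functions.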
Unconstrained minimization
Topics
gradient descent method
Newton's method
convergence rate analysis
self-concordant functions
Unconstrained minimization
Problem:
min f(x) over x ∈ dom f
Assumptions:
f is convex
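A minimal sketch (my example, not from the slides) of the simplest method on this list, gradient descent with a fixed step size, on a convex quadratic whose minimizer is known in closed form:

```python
import numpy as np

# f(x) = 1/2 x^T A x - b^T x  with A symmetric positive definite;
# the minimizer solves A x = b, so we can check the answer exactly.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

x = np.zeros(2)
step = 0.2            # fixed step < 2 / lambda_max(A) (~0.55 here) guarantees convergence
for _ in range(200):
    grad = A @ x - b  # gradient of f at x
    x = x - step * grad

print(np.allclose(x, np.linalg.solve(A, b)))
```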
Inequality constrained minimization
minimize f0(x)
subject to fi(x) ≤ 0, i = 1, . . . , m
Ax = b
f is convex, twice continuously differentiable
A ∈ ℝᵖˣⁿ with rank A = p
assume p⋆ is finite and attained
assume the problem is strictly feasible
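One standard way to handle the inequality constraints is the log-barrier (interior-point) idea. A minimal sketch on a one-variable instance (my example, not from the slides), where the central path is available in closed form:

```python
# Problem:  minimize x  subject to 1 - x <= 0,  optimum x* = 1.
# Barrier problem:  minimize t*x - log(x - 1)  over x > 1.
# Setting the derivative t - 1/(x - 1) to zero gives the central-path
# point x(t) = 1 + 1/t, which approaches x* as t grows.

for t in (1.0, 10.0, 100.0, 1000.0):
    x_t = 1.0 + 1.0 / t   # minimizer of the barrier problem
    gap = x_t - 1.0       # suboptimality is exactly 1/t here
    print(t, x_t, gap)
```

The 1/t suboptimality mirrors the general barrier-method bound m/t for m inequality constraints.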
CS295: Convex Optimization
Xiaohui Xie
Department of Computer Science
University of California, Irvine
Convex set
Definition
A set C is called convex if
x, y ∈ C ⟹ θx + (1 − θ)y ∈ C for all θ ∈ [0, 1]
In other words, a set is convex if it contains the line segment between any two of its points.
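A quick numerical spot-check of the definition (my sketch): the unit disk in ℝ² is convex, so every sampled convex combination of two of its points should stay inside it.

```python
import random

def in_C(p):
    """Membership in the unit disk C = {(x, y) : x^2 + y^2 <= 1}."""
    return p[0] ** 2 + p[1] ** 2 <= 1.0 + 1e-12

def sample_point():
    """Rejection-sample a point of C."""
    while True:
        p = (random.uniform(-1, 1), random.uniform(-1, 1))
        if in_C(p):
            return p

random.seed(0)
for _ in range(1000):
    x, y = sample_point(), sample_point()
    t = random.random()
    z = (t * x[0] + (1 - t) * y[0], t * x[1] + (1 - t) * y[1])
    assert in_C(z)   # the segment between x and y never leaves C

print("all sampled convex combinations stayed in C")
```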
Upcoming
Homework
Text Classification 1
Prof. Sameer Singh
Project
CS 295: STATISTICAL NLP
Homework 1 is up!
Next lectures will continue with more details
Sign up for the Kaggle account (@uci.edu email)