CS295: Convex Optimization
Xiaohui Xie
Department of Computer Science
University of California, Irvine
Course information
Prerequisites: multivariate calculus and linear algebra
Textbook: Convex Optimization by Boyd and Vandenberghe
Course website:
http:/
Convex Optimization for
Multitask Feature Learning
Priya Venkateshan
MULTITASK FEATURE LEARNING
MULTITASK FEATURE LEARNING VIA
EFFICIENT L2,1 NORM MINIMIZATION
A probabilistic framework for MTFL
k tasks, data of type
Data matrix
Linear model:
Weight matrix
Stochastic Subgradient Method
Lingjie Weng, Yutian Chen
Bren School of Information and Computer Science
UC Irvine
Subgradient
Recall the basic inequality for a convex differentiable f:
f(y) ≥ f(x) + ∇f(x)ᵀ(y − x)
The gradient at x determines a global under-estimator of f.
What if f is not differentiable?
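That question motivates the subgradient method: at a nondifferentiable point, any g with f(y) ≥ f(x) + g·(y − x) for all y can stand in for the gradient. A minimal sketch, with an illustrative one-dimensional objective f(x) = |x − 2|, start point, and 1/k step size all chosen for the example (none taken from the talk):

```python
# Subgradient method sketch for f(x) = |x - 2| (nondifferentiable at x = 2).
# Objective, start point, and diminishing step size are illustrative choices.

def f(x):
    return abs(x - 2.0)

def subgradient(x):
    # Any g with f(y) >= f(x) + g*(y - x) for all y is a subgradient of f at x.
    # For |x - 2| that means sign(x - 2) away from the kink,
    # and anything in [-1, 1] at x = 2 itself.
    if x > 2.0:
        return 1.0
    if x < 2.0:
        return -1.0
    return 0.0  # 0 is a valid subgradient at the minimizer

x = 5.0
best = f(x)
for k in range(1, 1001):
    x -= (1.0 / k) * subgradient(x)  # diminishing step size 1/k
    best = min(best, f(x))           # not a descent method: track the best value

print(best)
```

With the diminishing 1/k rule, the iterates cross the kink and oscillate around it with shrinking amplitude, so the best value found approaches 0.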
Load balancing on a
heterogeneous cluster.
David Carrillo
Daniel Miller
March 8, 2011
Dynamic Load-Balancing on
Heterogeneous Clusters.
An extension of the PhD dissertation work of
John Duselis, under Isaac Scherson.
Heterogeneous Cluster
Connected Via Network
Beamforming Optimization of MIMO Interference
Network
Feng Jiang
Department of Electrical Engineering and Computer Science
University of California at Irvine
Irvine, CA 92617
feng.jiang@uci.edu
March 1st, 2011
Outline
Background
System Model
Problem Formulation
Convex set
Definition
A set C is called convex if
x, y ∈ C  ⟹  θx + (1 − θ)y ∈ C for all θ ∈ [0, 1].
In other words, a set C is convex if the line segment between any
two points in C lies in C.
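The definition suggests a simple numerical check: sample pairs of points in the set and test random convex combinations (this can refute convexity but never prove it). A sketch with two illustrative example sets, the unit disk (convex) and an annulus (not convex), neither taken from the slides:

```python
# Numerical illustration of the definition: C is convex iff for all x, y in C
# and theta in [0, 1], theta*x + (1 - theta)*y is also in C.
import random

def in_disk(p):
    # Unit disk in R^2: convex.
    return p[0]**2 + p[1]**2 <= 1.0

def in_ring(p):
    # Annulus 0.25 <= |p|^2 <= 1: not convex (the hole breaks line segments).
    return 0.25 <= p[0]**2 + p[1]**2 <= 1.0

def looks_convex(member, trials=2000):
    # Sample points of the set, then test random convex combinations.
    # A failure refutes convexity; passing all trials proves nothing.
    random.seed(0)
    pts = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(10000)]
    pts = [p for p in pts if member(p)]
    for _ in range(trials):
        x, y = random.choice(pts), random.choice(pts)
        t = random.random()
        z = (t * x[0] + (1 - t) * y[0], t * x[1] + (1 - t) * y[1])
        if not member(z):
            return False
    return True

print(looks_convex(in_disk), looks_convex(in_ring))
```

For the annulus, a pair of points on opposite sides of the hole gives a midpoint outside the set, so the check fails quickly.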
Inequality constrained minimization
minimize f0(x)
subject to fi(x) ≤ 0, i = 1, …, m
Ax = b
fi convex, twice continuously differentiable
A ∈ R^(p×n) with rank A = p
assume the optimal value p⋆ is finite and attained
assume the problem is strictly feasible; hence strong duality holds
Equality constrained minimization
equality constrained minimization
eliminating equality constraints
Newton's method with equality constraints
infeasible start Newton method
Equality constrained minimization
minimize f(x)
subject to
Ax = b
f is convex, twice continuously differentiable
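For a quadratic objective, the optimality conditions of this problem form a single linear (KKT) system; Newton's method for general f solves such a system at every step. A sketch with hand-picked illustrative numbers, minimizing x1² + x2² subject to x1 + x2 = 1 (so P = 2I, A = [1 1], b = 1):

```python
# KKT conditions for  minimize (1/2) x^T P x  subject to A x = b  are linear:
#   [ P  A^T ] [ x  ]   [ 0 ]
#   [ A  0   ] [ nu ] = [ b ]
# Here: minimize x1^2 + x2^2 subject to x1 + x2 = 1 (illustrative numbers).

def solve3(M, rhs):
    # Tiny Gaussian elimination with partial pivoting for a 3x3 system.
    n = 3
    M = [row[:] + [rhs[i]] for i, row in enumerate(M)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

KKT = [[2.0, 0.0, 1.0],
       [0.0, 2.0, 1.0],
       [1.0, 1.0, 0.0]]
x1, x2, nu = solve3(KKT, [0.0, 0.0, 1.0])
print(x1, x2, nu)  # optimum x = (0.5, 0.5), multiplier nu = -1
```

The stationarity rows say 2x + A^T ν = 0 and the last row enforces Ax = b, so the solution is x = (1/2, 1/2) with ν = −1.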
Unconstrained minimization
Topics
gradient descent method
Newton's method
convergence rate analysis
self-concordant functions
Unconstrained minimization
Problem:
min_{x ∈ dom f} f(x)
Assumptions:
f is convex, twice continuously differentiable (hence dom f is open)
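As a minimal instance of the first topic above, here is gradient descent with a fixed step size on an illustrative strongly convex quadratic (the objective, start point, and step size are all chosen for the example):

```python
# Gradient descent sketch on f(x, y) = (x - 3)**2 + 2*(y + 1)**2,
# an illustrative strongly convex quadratic with minimizer (3, -1).

def grad(p):
    x, y = p
    return (2.0 * (x - 3.0), 4.0 * (y + 1.0))

p = (0.0, 0.0)
t = 0.2  # fixed step size, small enough for this objective's curvature
for _ in range(100):
    g = grad(p)
    p = (p[0] - t * g[0], p[1] - t * g[1])

print(p)
```

Each coordinate contracts toward the minimizer by a constant factor per iteration (0.6 and 0.2 here), the linear convergence rate that the analysis topic above makes precise.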
Convex functions
Definition
f : R^n → R is convex if dom f is a convex set and
f(θx + (1 − θ)y) ≤ θ f(x) + (1 − θ) f(y)
for all x, y ∈ dom f and θ ∈ [0, 1].
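The defining inequality can be checked numerically at sampled points; a failure refutes convexity, while passing proves nothing. A sketch with two illustrative functions, exp (convex on R) and sin (not convex on [0, 2π]):

```python
# Sample the defining inequality f(t*x + (1-t)*y) <= t*f(x) + (1-t)*f(y).
import math
import random

def satisfies_def(f, lo, hi, trials=5000):
    # Random pairs x, y in [lo, hi] and random t in [0, 1].
    random.seed(1)
    for _ in range(trials):
        x, y = random.uniform(lo, hi), random.uniform(lo, hi)
        t = random.random()
        if f(t * x + (1 - t) * y) > t * f(x) + (1 - t) * f(y) + 1e-12:
            return False  # found a violating chord: not convex
    return True

print(satisfies_def(math.exp, -2, 2), satisfies_def(math.sin, 0, 2 * math.pi))
```

For sin, any pair of points in (0, π) gives a chord lying below the graph, so a violation is found almost immediately.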
Duality
Lagrange dual problem
weak and strong duality
optimality conditions
perturbation and sensitivity analysis
generalized inequalities
Lagrangian
Consider the optimization problem in standard form
min f0(x)
s. t. fi(x) ≤ 0, i = 1, …, m
hi(x) = 0, i = 1, …, p
The Lagrangian L : R^n × R^m × R^p → R is
L(x, λ, ν) = f0(x) + Σi λi fi(x) + Σi νi hi(x)
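As a concrete instance (not from the slides): minimize x² subject to x ≥ 1, written in standard form with f1(x) = 1 − x ≤ 0. The Lagrangian and the dual function g(λ) = inf_x L(x, λ) have closed forms, and weak duality can be checked numerically:

```python
# Lagrangian sketch for: minimize x^2 subject to 1 - x <= 0 (illustrative).
# L(x, lam) = x^2 + lam*(1 - x); dual function g(lam) = inf over x of L.

def L(x, lam):
    return x * x + lam * (1.0 - x)

def g(lam):
    # dL/dx = 2x - lam = 0  =>  minimizer x = lam/2, so g(lam) = lam - lam^2/4.
    x = lam / 2.0
    return L(x, lam)

p_star = 1.0  # primal optimum: x* = 1, f0(x*) = 1
# Weak duality: g(lam) <= p_star for every lam >= 0.
assert all(g(0.1 * k) <= p_star + 1e-12 for k in range(100))
print(g(2.0))
```

Here strong duality also holds: g(λ) = λ − λ²/4 is maximized at λ = 2 with g(2) = 1 = p⋆, matching the strict-feasibility condition mentioned earlier in these slides.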
Optimality conditions
Optimization problems in standard form
minimize f0(x)
subject to fi(x) ≤ 0, i = 1, …, m
hi(x) = 0, i = 1, …, p
x = (x1, …, xn) ∈ R^n: optimization variables
f0 : R^n → R: objective (or cost) function
fi : R^n → R: inequality constraint functions
Optimization problems
Optimization problems in standard form
Convex problems in standard form
Some special problems
Color Constancy
Michael Bannister and Jenny Lam
March 3, 2011
The Color Constancy problem
Before
After
Solving the problem
1. find the color of the illuminant
2. correct the image
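A classical baseline for these two steps is the gray-world assumption (the average reflectance of a scene is achromatic); it is far simpler than the induced opponent response model the talk discusses, and is shown here only as an illustration, with a made-up three-pixel image:

```python
# Gray-world color constancy sketch (illustrative baseline, not the talk's model):
# 1. estimate the illuminant as the per-channel mean of the image,
# 2. rescale each channel so the estimated illuminant becomes neutral gray.

def gray_world(pixels):
    # pixels: list of (r, g, b) tuples with values in [0, 1]
    n = float(len(pixels))
    means = [sum(p[c] for p in pixels) / n for c in range(3)]  # step 1
    gray = sum(means) / 3.0
    return [tuple(p[c] * gray / means[c] for c in range(3))    # step 2
            for p in pixels]

# A gray scene under a reddish illuminant: red channel uniformly inflated.
img = [(0.8, 0.4, 0.4), (0.6, 0.3, 0.3), (0.4, 0.2, 0.2)]
corrected = gray_world(img)
print(corrected)
```

Because the example image is a gray scene scaled by the illuminant, the correction recovers equal r, g, b values at every pixel.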
Correcting the image
Induced opponent response model
accurate but complicated