Lecture 1: Optimization Models
Goal:
Mathematical modeling. Standard formulation of optimization problems. Feasible set.
1
Optimization
The general procedure to solve a practical problem:
Problem → Mathematical Modeling (variables, objective, constraints) → Algorithm
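To make this pipeline concrete, here is a toy instance (all products, profits, and resource numbers are invented for illustration): the variables, objective, and constraints are written down first, and then a simple algorithm searches the feasible set.

```python
# Toy instance of the modeling pipeline (hypothetical numbers):
# Problem:     choose how many units of two products to make.
# Variables:   x1, x2 (integer production quantities).
# Objective:   maximize profit 3*x1 + 5*x2.
# Constraints: 2*x1 + 4*x2 <= 8 (machine hours), x1, x2 >= 0.
# Algorithm:   brute-force enumeration (fine at this tiny scale).

def solve_toy():
    best = None
    for x1 in range(5):
        for x2 in range(3):
            if 2 * x1 + 4 * x2 <= 8:          # feasibility check
                profit = 3 * x1 + 5 * x2      # objective value
                if best is None or profit > best[0]:
                    best = (profit, x1, x2)
    return best
```

Real models replace the brute-force step with a proper algorithm, which is what the rest of the course develops.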
Amanda Nguyen
Department of Economics
UCLA
Economics 103
Introduction to Econometrics
Summer 2013
Problem Set 2 - Due Tuesday July 30
From textbook (all data sets are linked from the class website):
7.10, 7.16 (skip a, e)
10.3 (d, f, g, and h)
Math 164
Fall 2015
Homework 1
Due Friday, Oct 2.
1. Let w = (1, 2, 3)ᵀ and b = 5. Find the distance between the two planes in R^3 defined by
wᵀx + b = 1 and wᵀx + b = −1.
2. Find all values of b such that the matrix A =
3. Solve the linear system Ax = b for x ∈ R^3.
Math 164: Optimization
One-Dimensional Search Methods
Instructor: Wotao Yin
Department of Mathematics, UCLA
Spring 2015
based on Chong-Zak, 4th Ed.
online discussions on piazza.com
Goal of this lecture
Develop methods for solving the one-dimensional problem
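One classical one-dimensional method is golden-section search. The sketch below is a minimal version (it re-evaluates f at both interior points each iteration; an efficient implementation reuses one of the two evaluations), with an illustrative quadratic as the test function:

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Minimize a unimodal f on [a, b] by golden-section search."""
    rho = (3 - math.sqrt(5)) / 2           # ~0.382, golden-ratio reduction factor
    while b - a > tol:
        x1 = a + rho * (b - a)
        x2 = b - rho * (b - a)
        if f(x1) < f(x2):
            b = x2                          # minimizer lies in [a, x2]
        else:
            a = x1                          # minimizer lies in [x1, b]
    return (a + b) / 2

# Example: minimize (x - 2)^2 on [0, 5]; minimizer is x = 2
xmin = golden_section(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

Each iteration shrinks the bracketing interval by the same fixed factor, which is what makes the method's convergence rate easy to analyze.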
Math 164: Optimization
Gradient Methods
Main features of gradient methods
They are the most popular methods (in
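As a minimal illustration of the basic gradient method with a fixed step size (the quadratic objective and step length below are illustrative choices, not from the lecture):

```python
def gradient_descent(grad, x0, step=0.1, iters=200):
    """Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Example: f(x, y) = (x - 1)^2 + 2*(y + 3)^2, with gradient (2(x-1), 4(y+3));
# the unique minimizer is (1, -3).
sol = gradient_descent(lambda v: [2 * (v[0] - 1), 4 * (v[1] + 3)], [0.0, 0.0])
```

In practice the fixed step is replaced by a line search; this sketch only shows the basic iteration.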
Math 164: Optimization
Conjugate direction methods
material taken from the textbook Chong-Zak, 4th Ed., and the CG paper
by Shewchuk
Main features of conjugate direction methods
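A minimal sketch of the conjugate gradient method for a symmetric positive definite system Ax = b (the 2×2 system below is an arbitrary example chosen for illustration):

```python
def conjugate_gradient(A, b, tol=1e-10):
    """Solve Ax = b for symmetric positive definite A (lists of lists)."""
    n = len(b)
    x = [0.0] * n
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    r = [bi - ai for bi, ai in zip(b, matvec(A, x))]   # residual b - Ax
    d = list(r)                                        # first search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(n):                                 # exact convergence in <= n steps
        if rs < tol:
            break
        Ad = matvec(A, d)
        alpha = rs / sum(di * adi for di, adi in zip(d, Ad))
        x = [xi + alpha * di for xi, di in zip(x, d)]
        r = [ri - alpha * adi for ri, adi in zip(r, Ad)]
        rs_new = sum(ri * ri for ri in r)
        d = [ri + (rs_new / rs) * di for ri, di in zip(r, d)]
        rs = rs_new
    return x

# Example: a small SPD system; exact solution is (1/11, 7/11)
x = conjugate_gradient([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
```

Because the directions are A-conjugate, the method terminates in at most n iterations in exact arithmetic, which is the key property the lecture develops.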
Math 164: Optimization
Barzilai-Borwein Method
Main features of the Barzilai-Borwein (BB) method
The BB method was published in an 8-page paper in 1988
It
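A sketch of the BB step inside plain gradient descent, using the step size α_k = (sᵀs)/(sᵀy) with s = x_k − x_{k−1} and y = g_k − g_{k−1} (the quadratic test problem and the first fixed step are illustrative choices):

```python
def bb_gradient(grad, x0, iters=100, step0=0.1):
    """Gradient descent with the Barzilai-Borwein step size
    alpha_k = (s^T s) / (s^T y), s = x_k - x_{k-1}, y = g_k - g_{k-1}."""
    x_prev = list(x0)
    g_prev = grad(x_prev)
    # First step uses a fixed size, since no previous iterate exists yet.
    x = [xi - step0 * gi for xi, gi in zip(x_prev, g_prev)]
    for _ in range(iters):
        g = grad(x)
        s = [a - b for a, b in zip(x, x_prev)]
        y = [a - b for a, b in zip(g, g_prev)]
        sy = sum(si * yi for si, yi in zip(s, y))
        if abs(sy) < 1e-16:
            break                                   # converged (or breakdown)
        alpha = sum(si * si for si in s) / sy
        x_prev, g_prev = x, g
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
    return x

# Example: strongly convex quadratic f(x) = x1^2 + 10*x2^2, minimizer (0, 0)
sol = bb_gradient(lambda v: [2 * v[0], 20 * v[1]], [5.0, 1.0])
```

Note the objective value is not monotone along BB iterates; the method's effectiveness comes from the step size mimicking second-order information.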
Math 164: Optimization
Linear programming
History
The word programming used traditionally by planners t
Math 164: Optimization
Newton's Method
Main features of Newton's method
Uses both first derivatives (gradients) and second derivatives (Hessians)
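A minimal one-dimensional sketch of the Newton iteration x_{k+1} = x_k − f′(x_k)/f″(x_k), which uses both first- and second-derivative information (the test function below is an invented example whose local minimizer is x = 9/4):

```python
def newton_1d(fprime, fsecond, x0, iters=20):
    """Newton's method for 1-D minimization:
    x_{k+1} = x_k - f'(x_k) / f''(x_k)."""
    x = x0
    for _ in range(iters):
        x = x - fprime(x) / fsecond(x)
    return x

# Example (hypothetical test function): f(x) = x^4 - 3x^3,
# f'(x) = 4x^3 - 9x^2, f''(x) = 12x^2 - 18x; local minimizer at x = 9/4.
xmin = newton_1d(lambda x: 4 * x**3 - 9 * x**2,
                 lambda x: 12 * x**2 - 18 * x, 3.0)
```

Started close enough to the minimizer, the iteration converges quadratically; a full implementation safeguards against f″ ≤ 0.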
Math 164: Optimization
Basics of Optimization
Goals of this lecture
For the general form
minimize f(x)
subject to x ∈ Ω,
we study the
Math 164: Optimization
Nonlinear optimization with inequality constraints
We discuss how to recognize a
Introduction to Optimization
Major subfields
Overview
Continuous vs Discrete
Continuous optimization:
convex vs non-convex
unconstrained vs constrained
l
Math 164: Optimization
Algorithms for constrained optimization
Coverage
We will learn some algorithms for constrained optimization
Math 164: Optimization
Krylov subspace, nonlinear CG, and preconditioning
Math 164: Optimization
Optimization application examples
Job assignment problem
An insurance office handles three types of work: Information, Policy, Claims.
Math 164: Optimization
Support vector machine
Support vector machine (SVM)
Background: to classify a set of data points into two sets.
Examples:
emails: legitimate vs. spam
Lecture 28: Optimality Conditions for Nonlinear Equality Constraints
Definition 1. The optimality conditions are expressed in terms of the Lagrangian function
L(x, λ) = f(x) − Σ_{i=1}^m λ_i g_i(x) = f(x) − λᵀg(x),
where λ is a vector of Lagrange multipliers, and
Lecture 25: Lagrange Multiplier and Lagrangian Function
1
Lagrange Multiplier
Definition 1. If x* is a local minimizer of f(x) subject to Ax = b, then
Zᵀ∇f(x*) = 0
∇f(x*) = Aᵀλ
where λ ∈ R^m is called the Lagrange multiplier. In particular, if A ∈ R^{1×n}, then λ ∈ R.
Remark
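A worked instance of these conditions, with an objective and constraint invented for illustration:

```latex
% Illustrative example (not from the lecture):
% minimize f(x) = x_1^2 + x_2^2  subject to  x_1 + x_2 = 2,
% so A = (1 \; 1), b = 2, and \nabla f(x) = (2x_1,\, 2x_2)^T.
\[
\begin{aligned}
\nabla f(x^*) = A^T\lambda &\;\Longrightarrow\; 2x_1^* = \lambda,\quad 2x_2^* = \lambda,\\
A x^* = b &\;\Longrightarrow\; x_1^* + x_2^* = 2,
\end{aligned}
\]
% giving x^* = (1, 1)^T with Lagrange multiplier \lambda = 2.
% Null-space check: Z = (1, -1)^T spans \mathrm{null}(A), and
% Z^T \nabla f(x^*) = 2\cdot 1 - 2\cdot 1 = 0, as required.
```

Both forms of the condition agree: the gradient at x* is a multiple of Aᵀ, so it vanishes along the null space of A.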
Lecture 8: Basic Solutions
1
Basic Solutions
Definition 1. Consider a linear programming problem in standard form
minimize z = cᵀx
subject to Ax = b
x ≥ 0.
Here x ∈ R^n and A ∈ R^{m×n} with m ≤ n. We assume that A has full rank, i.e., rank(A) = m, which
implies that t
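The definition can be illustrated by enumerating basic solutions directly: choose m of the n columns, solve the resulting square system, and set the remaining variables to zero (the small system below is a made-up example; column choices that are linearly dependent are skipped):

```python
from itertools import combinations

def solve_square(M, rhs):
    """Gaussian elimination with partial pivoting; raises ValueError if singular."""
    n = len(M)
    A = [row[:] + [r] for row, r in zip(M, rhs)]       # augmented matrix copy
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        if abs(A[p][k]) < 1e-12:
            raise ValueError("singular")
        A[k], A[p] = A[p], A[k]
        for i in range(k + 1, n):
            f = A[i][k] / A[k][k]
            for j in range(k, n + 1):
                A[i][j] -= f * A[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):                      # back substitution
        x[i] = (A[i][n] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

def basic_solutions(A, b):
    """All basic solutions of Ax = b: choose m of the n columns as basic,
    solve for them, and set the nonbasic variables to zero."""
    m, n = len(A), len(A[0])
    sols = []
    for cols in combinations(range(n), m):
        try:
            xB = solve_square([[A[i][j] for j in cols] for i in range(m)], b)
        except ValueError:
            continue                                    # dependent columns: no basis
        x = [0.0] * n
        for j, v in zip(cols, xB):
            x[j] = v
        sols.append(x)
    return sols

# Example (hypothetical small system): x1 + x2 + x3 = 4, x2 + x3 = 2
sols = basic_solutions([[1.0, 1.0, 1.0], [0.0, 1.0, 1.0]], [4.0, 2.0])
```

Here the basis {x2, x3} is rejected because those two columns are identical, leaving two basic solutions.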
Lecture 7: Concavity and Standard Form of Linear Programs
1
Concavity
Given a convex set S, f(x) is convex on S if and only if −f(x) is concave on S. Based on this property,
we can decide the concavity of f by deciding the convexity of −f.
Example 1. f (
Lecture 27: Optimality Conditions for Linear Inequality Constraints II
1
Sufficient Conditions for Linear Inequality Constraints
Definition 1. The active constraint a_iᵀx − b_i ≥ 0 at x*, i.e., a_iᵀx* − b_i = 0, is called
nondegenerate if the associated multiplier
Lecture 4: SVM and Feasibility
1
Support Vector Machines (SVM)
Given a set of parameters describing each subject (features) and a set of training points (points with
known labels), support vector machines (SVM) can be used to classify two sets of data with di
Lecture 21: Optimality Conditions
We study the problem
minimize_x f(x),
where no constraints are placed on the variables x. For example, data-fitting problems use the objective
function to measure the difference between the model and the data (least squares
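As a small instance of such a data-fitting objective, consider a least-squares line fit: setting the gradient of the sum of squared residuals to zero gives the 2×2 normal equations solved below (the data is invented and lies exactly on y = 2x + 1, so the fit recovers the line exactly):

```python
def fit_line(xs, ys):
    """Least-squares fit y ~ a*x + b by solving the 2x2 normal equations,
    obtained by setting the gradient of sum((a*x + b - y)^2) to zero."""
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx                 # assumes at least two distinct x's
    a = (n * sxy - sx * sy) / det
    b = (sxx * sy - sx * sxy) / det
    return a, b

# Example (invented data on the line y = 2x + 1)
a, b = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

The unconstrained optimality condition ∇f = 0 is exactly what produces the normal equations here.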
Lecture 12: Simplex Method II
Consider the linear program in standard form
minimize z = cᵀx
subject to Ax = b
x ≥ 0.
1. Optimality Test. Since
Bx_B + Nx_N = b  ⟹  x_B = B^{-1}b − B^{-1}Nx_N,
which is the general formula for x_B, we have
z = cᵀx = c_Bᵀx_B + c_Nᵀx_N = c_Bᵀ(B^{-1}b − B^{-1}Nx_N) + c_Nᵀx_N
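The formula above can be checked numerically. The sketch below (a hypothetical helper, hard-coded to a 2×2 basis for simplicity) computes x_B = B^{-1}b, the objective value at x_N = 0, and the reduced costs c_N − Nᵀ(B^{-T}c_B):

```python
def reduced_costs(B, N, cB, cN, b):
    """For a 2x2 basis matrix B (closed-form inverse for clarity), return
    the objective value z = cB^T B^{-1} b at xN = 0 and the reduced costs
    cN - N^T (B^{-T} cB)."""
    det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
    Binv = [[ B[1][1] / det, -B[0][1] / det],
            [-B[1][0] / det,  B[0][0] / det]]
    xB = [sum(Binv[i][j] * b[j] for j in range(2)) for i in range(2)]   # B^{-1} b
    y  = [sum(Binv[j][i] * cB[j] for j in range(2)) for i in range(2)]  # B^{-T} cB
    rc = [cN[k] - sum(N[i][k] * y[i] for i in range(2)) for k in range(len(cN))]
    z  = sum(cB[i] * xB[i] for i in range(2))                           # objective at xN = 0
    return z, rc

# Example (hypothetical data): B = I, N = [[1], [1]], cB = (1, 1), cN = (3,), b = (2, 3)
z, rc = reduced_costs([[1.0, 0.0], [0.0, 1.0]], [[1.0], [1.0]],
                      [1.0, 1.0], [3.0], [2.0, 3.0])
```

A positive reduced cost (as here) means increasing that nonbasic variable would increase z, so the current basis passes the optimality test for that column.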
Lecture 14: Degeneracy and Initialization
1
Multiple Solutions
Example 1. Solve the following linear program using the simplex method:
minimize z = x1
subject to 2x1 + x2 ≤ 2
x1 + x2 ≤ 3
x1 ≤ 3
x1, x2 ≥ 0.
Solution. We first convert the problem into standard form
Math 164: Introduction to Optimization
What is mathematical optimization?
Optimization models the goal of solving a problem in the best possible way.
Examples:
Math 164: Introduction to Optimization
Resource-constrained revenue optimization
m resources; resource i has b_i units available
n products; product j uses
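A toy version of this model (all numbers invented): maximize revenue cᵀx subject to resource limits Ax ≤ b, x ≥ 0. With two products, the LP can be solved by brute-force enumeration of the feasible polygon's vertices:

```python
from itertools import combinations

def lp_max_by_vertices(c, A, b):
    """Maximize c^T x subject to A x <= b, x >= 0, for 2 variables, by
    enumerating candidate vertices: each is the intersection of two
    constraints (including the axes x1 >= 0, x2 >= 0)."""
    rows = [(A[i][0], A[i][1], b[i]) for i in range(len(A))]
    rows += [(-1.0, 0.0, 0.0), (0.0, -1.0, 0.0)]        # nonnegativity as <= rows
    best = None
    for (a1, a2, r1), (b1, b2, r2) in combinations(rows, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue                                    # parallel constraints
        x = ((r1 * b2 - a2 * r2) / det, (a1 * r2 - r1 * b1) / det)
        # keep only feasible intersection points
        if all(p[0] * x[0] + p[1] * x[1] <= p[2] + 1e-9 for p in rows):
            val = c[0] * x[0] + c[1] * x[1]
            if best is None or val > best[0]:
                best = (val, x)
    return best

# Example (hypothetical data): revenues c = (3, 5); resource limits
# 2*x1 + x2 <= 10 (resource 1) and x1 + 3*x2 <= 15 (resource 2)
best = lp_max_by_vertices([3.0, 5.0], [[2.0, 1.0], [1.0, 3.0]], [10.0, 15.0])
```

Enumerating vertices is exponential in general; the simplex method covered later visits vertices selectively instead.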
Math 164: Optimization
The Simplex method
Overview: idea and approach
If a standard-form LP has a solution
Math 164: Optimization
Nonlinear optimization with equality constraints
We discuss how to recognize a s
Lecture 19: Complementary Slackness I
Consider the primal linear program in standard form
primal (P)
minimize z = cᵀx
subject to Ax = b
x ≥ 0
dual (D)
maximize w = bᵀy
subject to Aᵀy ≤ c
y unrestricted.
More Remarks About Strong Duality
For given optimal
Lecture 6: Convex Problems and Convex Functions
1
Global Minimizer of Convex Problems
Theorem 1. If x is a local minimizer of a convex optimization problem, then x is a global minimizer.
Moreover, if the objective function f is strictly convex, then x is
Lecture 23: Optimality Conditions for Linear Equality Constraints
Example 1. Transform the following problem into an unconstrained one:
minimize f(x) = x1² − 2x1 + x2² − x3² + 4x3
subject to x1 − x2 + 2x3 = 2.
Solution. We choose a basis matrix Z as the null-space
Lecture 24: Lagrange Multipliers
Lemma 1 (Sufficient Conditions, Linear Equality Constraints). If x* satisfies
Ax* = b,
Zᵀ∇f(x*) = 0,
Zᵀ∇²f(x*)Z is positive definite,
where Z is a basis matrix for the null space of A, then x* is a strict local minimizer
Lecture 22: Optimality Conditions
1
Optimality Conditions for Unconstrained Problems
necessary condition: if x* is a local maximizer, then ∇f(x*) = 0 and ∇²f(x*) is negative
semidefinite.
sufficient condition: if ∇f(x*) = 0 and ∇²f(x*) is negative definite