Math 164
Fall 2015
Homework 1
Due Friday, Oct 2.
1. Let w = (1, 2, 3)^T and b = 5. Find the distance between the two planes in R^3 defined by
w^T x + b = 1
and
w^T x + b = 2.
2. Find all values of b such that the matrix A = [matrix entries lost in extraction] ...
3. Solve the linear system Ax = b for x ∈ R^3.
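For problem 1, two parallel planes w^T x = c_1 and w^T x = c_2 are separated by |c_1 - c_2| / ||w||. A minimal numerical sketch (the right-hand sides below are illustrative values, not taken from the source):

```python
import math

def plane_distance(w, c1, c2):
    # Distance between parallel planes w.x = c1 and w.x = c2 is |c1 - c2| / ||w||.
    norm_w = math.sqrt(sum(wi * wi for wi in w))
    return abs(c1 - c2) / norm_w

# w = (1, 2, 3), so ||w|| = sqrt(14); example right-hand sides c1 = 1, c2 = 2
d = plane_distance((1.0, 2.0, 3.0), 1.0, 2.0)
print(d)  # 1/sqrt(14) ≈ 0.2673
```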
Lecture 1: Optimization Models
Goal:
Mathematical modeling. Standard formulation of optimization problems. Feasible set.
1. Optimization
The general procedure for solving a practical problem:
Problem → Mathematical Modeling (variables, objective, constraints) → Algorithm
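As a toy instance of the variables/objective/constraints template (all numbers below are invented for illustration):

```python
# Toy model: variables x = (x1, x2) are production levels.
def objective(x):
    # Profit: 3 dollars per unit of product 1, 5 per unit of product 2 (made-up data).
    return 3 * x[0] + 5 * x[1]

def feasible(x):
    # Constraints: 2 labor hours per unit of product 1, 4 per unit of product 2,
    # at most 100 hours total; production cannot be negative.
    return 2 * x[0] + 4 * x[1] <= 100 and x[0] >= 0 and x[1] >= 0

x = (10, 20)
print(feasible(x), objective(x))  # True 130
```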
Amanda Nguyen
Department of Economics
UCLA
Economics 103
Introduction to Econometrics
Summer 2013
Problem Set 2 - Due Tuesday July 30
From textbook: (all data sets can be found linked to from the class website)
7.10, 7.16 (skip a, e)
10.3 (d, f, g, and h)
Math 164: Optimization
One-Dimensional Search Methods
Instructor: Wotao Yin
Department of Mathematics, UCLA
Spring 2015
based on Chong-Zak, 4th Ed.
online discussions on piazza.com
Goal of this lecture
Develop methods for solving the one-dimensional problem.
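One classic one-dimensional method is golden-section search, which shrinks a bracket around a minimizer of a unimodal function using only function evaluations; a minimal sketch:

```python
import math

def golden_section(f, a, b, tol=1e-8):
    # Shrink [a, b] by the golden ratio each step, keeping the minimizer inside.
    rho = (math.sqrt(5) - 1) / 2          # ≈ 0.618
    x1 = b - rho * (b - a)
    x2 = a + rho * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                        # minimizer lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - rho * (b - a)
            f1 = f(x1)
        else:                              # minimizer lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + rho * (b - a)
            f2 = f(x2)
    return (a + b) / 2

xmin = golden_section(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
print(xmin)  # ≈ 2.0
```

Because rho^2 = 1 - rho, one of the two interior points can be reused at every step, so each iteration costs only one new function evaluation.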
Math 164: Optimization
Gradient Methods
some material taken from Chong-Zak, 4th Ed.
Main features of gradient methods
They are the most popular methods (in
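A minimal fixed-step gradient descent sketch (the step size and stopping rule here are arbitrary choices for illustration):

```python
def gradient_descent(grad, x0, step=0.1, tol=1e-10, max_iter=10000):
    # Iterate x_{k+1} = x_k - step * grad(x_k); stop when the step taken is tiny.
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        x_new = [xi - step * gi for xi, gi in zip(x, g)]
        if sum((a - b) ** 2 for a, b in zip(x, x_new)) < tol ** 2:
            return x_new
        x = x_new
    return x

# minimize f(x) = (x1 - 1)^2 + (x2 + 2)^2, gradient = (2(x1 - 1), 2(x2 + 2))
xstar = gradient_descent(lambda x: [2 * (x[0] - 1), 2 * (x[1] + 2)], [0.0, 0.0])
print(xstar)  # ≈ [1.0, -2.0]
```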
Math 164: Optimization
Conjugate direction methods
material taken from the textbook Chong-Zak, 4th Ed., and the CG paper
by Shewchuk
Main features of conjugate direction methods
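For a quadratic f(x) = (1/2) x^T Q x - b^T x with Q symmetric positive definite, conjugate directions give finite termination in at most n steps. A minimal conjugate gradient sketch for solving Qx = b (pure Python, 2x2 example):

```python
def cg(Q, b, x0, tol=1e-12):
    # Conjugate gradient for Qx = b, with Q symmetric positive definite.
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(len(v)))
                           for i in range(len(M))]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    x = list(x0)
    r = [bi - qi for bi, qi in zip(b, matvec(Q, x))]   # residual b - Qx
    d = list(r)                                         # first search direction
    rr = dot(r, r)
    for _ in range(len(b)):                             # at most n steps exactly
        if rr < tol:
            break
        Qd = matvec(Q, d)
        alpha = rr / dot(d, Qd)                         # exact line search along d
        x = [xi + alpha * di for xi, di in zip(x, d)]
        r = [ri - alpha * qdi for ri, qdi in zip(r, Qd)]
        rr_new = dot(r, r)
        d = [ri + (rr_new / rr) * di for ri, di in zip(r, d)]  # Q-conjugate update
        rr = rr_new
    return x

x = cg([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0], [0.0, 0.0])
print(x)  # exact solution of Qx = b is (1/11, 7/11)
```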
Math 164: Optimization
Barzilai-Borwein Method
Main features of the Barzilai-Borwein (BB) method
The BB method was published in an 8-page paper in 1988
It
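The BB step size replaces line search with alpha_k = (s^T s) / (s^T y), where s = x_k - x_{k-1} and y = grad f(x_k) - grad f(x_{k-1}). A minimal sketch on a made-up quadratic:

```python
def bb_method(grad, x0, alpha0=0.1, iters=50):
    # Barzilai-Borwein gradient method: step alpha_k = (s.s)/(s.y).
    x_prev = list(x0)
    g_prev = grad(x_prev)
    x = [xi - alpha0 * gi for xi, gi in zip(x_prev, g_prev)]  # first step: fixed size
    for _ in range(iters):
        g = grad(x)
        s = [a - b for a, b in zip(x, x_prev)]
        y = [a - b for a, b in zip(g, g_prev)]
        sy = sum(si * yi for si, yi in zip(s, y))
        if abs(sy) < 1e-16:                                   # converged (s ≈ 0)
            break
        alpha = sum(si * si for si in s) / sy
        x_prev, g_prev = x, g
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
    return x

# minimize f(x) = 2(x1 - 3)^2 + (x2 + 1)^2
xstar = bb_method(lambda x: [4 * (x[0] - 3), 2 * (x[1] + 1)], [0.0, 0.0])
print(xstar)  # ≈ [3.0, -1.0]
```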
Math 164: Optimization
Linear programming
material taken from the textbook Chong-Zak, 4th Ed.
History
The word "programming" was traditionally used by planners to describe the planning of schedules and activities.
Lecture 4: SVM and Feasibility
1. Support Vector Machines (SVM)
Given a set of features describing a subject and a set of training points (points with
known labels), support vector machines (SVM) can be used to classify data into two sets with different labels.
Math 164: Optimization
Newton's Method
Main features of Newton's method
Uses both first derivatives (gradients) and second derivatives (Hessians).
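In one dimension the Newton iteration for minimization is x <- x - f'(x)/f''(x); a minimal sketch on a made-up function:

```python
def newton_1d(df, d2f, x0, tol=1e-12, max_iter=100):
    # Newton's method for minimizing f: iterate x <- x - f'(x)/f''(x).
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# minimize f(x) = x^4 - 8x^2 (local minimizers at x = ±2), starting near x = 3
xmin = newton_1d(lambda x: 4 * x**3 - 16 * x, lambda x: 12 * x**2 - 16, 3.0)
print(xmin)  # ≈ 2.0
```

Note the usual caveats apply: Newton's method needs a starting point close enough to a minimizer, and f''(x) must not vanish along the way.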
Math 164: Optimization
Basics of Optimization
Goals of this lecture
For the general form
    minimize f(x)
    subject to x ∈ Ω,
we study the
Math 164: Optimization
Nonlinear optimization with inequality constraints
we discuss how to recognize a
Introduction to Optimization
Major subfields
Overview
Continuous vs Discrete
Continuous optimization:
convex vs non-convex
unconstrained vs constrained
linear vs nonlinear
Math 164: Optimization
Algorithms for constrained optimization
Coverage
We will learn some algorithms for constrained optimization.
Math 164: Optimization
Krylov subspace, nonlinear CG, and preconditioning
Math 164: Optimization
The Simplex method
Overview: idea and approach
If a standard-form LP has a solution, then it has an optimal basic feasible solution.
Math 164: Optimization
Optimization application examples
Job assignment problem
An insurance office handles three types of work: Information, Policy, and Claims.
Math 164: Optimization
Support vector machine
Support vector machine (SVM)
Background: classify a set of data points into two sets.
Examples:
emails: legitimate vs. spam
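A soft-margin linear SVM can be trained by stochastic subgradient descent on the regularized hinge loss. The sketch below is a simple illustration on invented 2-D data, not a production solver:

```python
import random

def train_svm(points, labels, lam=0.01, lr=0.01, epochs=500, seed=0):
    # Soft-margin linear SVM via stochastic subgradient descent on
    # (lam/2)||w||^2 + mean(max(0, 1 - y*(w.x + b))).
    rng = random.Random(seed)
    w = [0.0, 0.0]
    b = 0.0
    idx = list(range(len(points)))
    for _ in range(epochs):
        rng.shuffle(idx)
        for i in idx:
            x, y = points[i], labels[i]
            margin = y * (w[0] * x[0] + w[1] * x[1] + b)
            if margin < 1:                       # point inside margin: hinge active
                w = [wj + lr * (y * xj - lam * wj) for wj, xj in zip(w, x)]
                b += lr * y
            else:                                # only the regularizer pulls on w
                w = [wj - lr * lam * wj for wj in w]
    return w, b

pts = [(2.0, 2.0), (3.0, 1.0), (-2.0, -1.0), (-1.0, -3.0)]
ys = [1, 1, -1, -1]
w, b = train_svm(pts, ys)
preds = [1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1 for x in pts]
print(preds)  # matches ys on this separable toy set
```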
Math 164: Introduction to Optimization
Resource-constrained revenue optimization
m resources; resource i has b_i units available
n products; product j uses a_ij units of resource i
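In general form (with the assumed notation a_ij for the units of resource i consumed per unit of product j, and c_j for the revenue per unit of product j), the model is the linear program:

```latex
\begin{aligned}
\text{maximize} \quad & \sum_{j=1}^{n} c_j x_j \\
\text{subject to} \quad & \sum_{j=1}^{n} a_{ij} x_j \le b_i, \quad i = 1, \dots, m, \\
& x_j \ge 0, \quad j = 1, \dots, n.
\end{aligned}
```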
Lecture 21: Optimality Conditions
We study the problem
    minimize_x f(x),
where no constraints are placed on the variables x. For example, data fitting problems use the objective
function to measure the difference between the model and the data (least squares).
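For instance, fitting a line y ≈ a*t + c by least squares minimizes f(a, c) = sum_i (a*t_i + c - y_i)^2. A small sketch solving the 2x2 normal equations directly (the data points are invented):

```python
def fit_line(ts, ys):
    # Least squares line y = a*t + c via the normal equations:
    # [sum t^2  sum t] [a]   [sum t*y]
    # [sum t    n    ] [c] = [sum y  ]
    n = len(ts)
    stt = sum(t * t for t in ts)
    st = sum(ts)
    sty = sum(t * y for t, y in zip(ts, ys))
    sy = sum(ys)
    det = stt * n - st * st
    a = (sty * n - st * sy) / det
    c = (stt * sy - st * sty) / det
    return a, c

a, c = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
print(a, c)  # data lies exactly on y = 2t + 1, so a = 2, c = 1
```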
Lecture 12: Simplex Method II
Consider the linear program in standard form:
    minimize   z = c^T x
    subject to Ax = b,
               x ≥ 0.
1. Optimality Test. Since
    B x_B + N x_N = b,
we have
    x_B = B^{-1} b − B^{-1} N x_N,
which is the general formula for x_B. Substituting,
    z = c^T x = c_B^T x_B + c_N^T x_N = c_B^T (B^{-1} b − B^{-1} N x_N) + c_N^T x_N.
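Numerically, the formulas above can be checked on toy data (the basis, costs, and right-hand side below are invented for illustration):

```python
def basic_solution_and_z(B, N, b, cB, cN, xN):
    # x_B = B^{-1} (b - N x_N); z = cB . x_B + cN . x_N  (2x2 basis, inverted by hand)
    det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
    rhs = [b[i] - sum(N[i][j] * xN[j] for j in range(len(xN))) for i in range(2)]
    xB = [( B[1][1] * rhs[0] - B[0][1] * rhs[1]) / det,
          (-B[1][0] * rhs[0] + B[0][0] * rhs[1]) / det]
    z = sum(c * x for c, x in zip(cB, xB)) + sum(c * x for c, x in zip(cN, xN))
    return xB, z

# identity basis, nonbasic variables held at zero
xB, z = basic_solution_and_z(B=[[1.0, 0.0], [0.0, 1.0]],
                             N=[[2.0], [3.0]],
                             b=[4.0, 6.0],
                             cB=[0.0, 0.0], cN=[-1.0], xN=[0.0])
print(xB, z)  # [4.0, 6.0] 0.0
```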
Lecture 14: Degeneracy and Initialization
1. Multiple Solutions
Example 1. Solve the following linear program using the simplex method:
    minimize   z = x1
    subject to 2x1 + x2 ≤ 2
               x1 + x2 ≤ 3
               x1 ≤ 3
               x1, x2 ≥ 0.
Solution. We first convert the problem into standard form
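Assuming the three constraints are "≤" inequalities (the inequality signs are garbled in the source), the standard-form conversion introduces one nonnegative slack variable per constraint:

```latex
2x_1 + x_2 + s_1 = 2,\qquad
x_1 + x_2 + s_2 = 3,\qquad
x_1 + s_3 = 3,\qquad
x_1,\, x_2,\, s_1,\, s_2,\, s_3 \ge 0 .
```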
Math 164/1
Spring 2017
INFORMATION
INSTRUCTOR: Robert Brown ([email protected])
OFFICE: Math Sci 6220    OFFICE HOURS: Monday 3-4 and Tuesday 11-1
TEACHING ASSISTANT: Howard Heaton ([email protected])
OFFICE: Math Sci 5343M    OFFICE HOURS:
TEXT: E. Chong and S. Zak, An Introduction to Optimization
1. Let f(x) = 3x1^2 − 2x1x2 + 2x2^2 − 3x1 − 3x2. Given x^(0) = [1, 1]^T, calculate x^(1) for
the steepest descent algorithm.
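A numeric check of this computation, reading the objective as f(x) = 3x1^2 − 2x1x2 + 2x2^2 − 3x1 − 3x2 (an assumption, since the signs are garbled in the source). Writing f(x) = (1/2) x^T Q x − c^T x gives Q = [[6, −2], [−2, 4]] and c = (3, 3):

```python
def steepest_descent_step(Q, c, x):
    # For f(x) = 0.5 x.Qx - c.x, exact line search gives alpha = (g.g)/(g.Qg),
    # where g = Qx - c, and the next iterate is x - alpha * g.
    g = [sum(Q[i][j] * x[j] for j in range(2)) - c[i] for i in range(2)]
    Qg = [sum(Q[i][j] * g[j] for j in range(2)) for i in range(2)]
    alpha = sum(gi * gi for gi in g) / sum(gi * qi for gi, qi in zip(g, Qg))
    return [xi - alpha * gi for xi, gi in zip(x, g)]

x1 = steepest_descent_step([[6.0, -2.0], [-2.0, 4.0]], [3.0, 3.0], [1.0, 1.0])
print(x1)  # [6/7, 8/7]: g = (1, -1) at (1, 1) and alpha = 1/7
```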
2. Solve the following linear programming problem by the simplex algorithm.
Maximize 2x1 + x2
subject to
    x1 + x2 ≤ 2
    3x1 + x2 ≤ 3
    x1, x2 ≥ 0.
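Assuming the two inequalities are "≤" constraints (the signs are garbled in the source), the optimum can be cross-checked by brute-force enumeration of candidate vertices of the feasible region:

```python
from itertools import combinations

# maximize 2x1 + x2 subject to A x <= b (rows 3 and 4 encode x1 >= 0, x2 >= 0)
A = [[1.0, 1.0], [3.0, 1.0], [-1.0, 0.0], [0.0, -1.0]]
b = [2.0, 3.0, 0.0, 0.0]

def solve2(r1, r2, b1, b2):
    # Intersection of the two lines r1.x = b1 and r2.x = b2 (None if parallel).
    det = r1[0] * r2[1] - r1[1] * r2[0]
    if abs(det) < 1e-12:
        return None
    return ((b1 * r2[1] - b2 * r1[1]) / det, (r1[0] * b2 - r2[0] * b1) / det)

best = None
for i, j in combinations(range(len(A)), 2):
    v = solve2(A[i], A[j], b[i], b[j])
    # keep the intersection only if it satisfies every constraint
    if v and all(A[k][0] * v[0] + A[k][1] * v[1] <= b[k] + 1e-9 for k in range(len(A))):
        z = 2 * v[0] + v[1]
        if best is None or z > best[0]:
            best = (z, v)

print(best)  # optimum z = 2.5 at (0.5, 1.5)
```

This works because a linear program that has a solution attains it at a vertex of the feasible polytope, which is exactly the fact the simplex method exploits.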
1. Given the linear programming problem
    maximize 3x1 + 2x2
    subject to
        x1 + x2 ≤ 1
        2x1 + 3x2 ≥ 2
(a) Write the problem in standard form. (b) Write the coefficient matrix A from part (a).
Discussion Notes for Optimization (MATH 164)
UCLA
Spring 2017
Created by
Howard Heaton
Purpose: This document is a compilation of notes generated for discussion in MATH 164 with reference
credit due to Chong and Zak's Introduction to Optimization text.
Math 164/1
Spring 2017
THEORY CHECKLIST 2
Steepest descent algorithm: orthogonal steps; descent property
Optimal step size for quadratic f(x)
Newton's method from the quadratic Taylor polynomial (n variables)
Lagrange's Theorem for linear constraints
Ortho