MATH 4070-6070
Solutions for HW Problems 2 (7-9)
7.2a,b,d Let f(x) = x^2 + 4 cos x, x ∈ R. We wish to find the minimizer x* of f over the
interval [1, 2]. (Calculator users: note that in cos x, x is in radians.)
a. Plot f (x) versus x over the interval [1, 2].
b.
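One standard way to locate the minimizer of 7.2 numerically is a golden-section search; a minimal sketch (the tolerance and the helper routine are illustrative choices, not part of the problem statement):

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Golden-section search for a minimizer of a unimodal f on [a, b]."""
    rho = (3 - math.sqrt(5)) / 2        # ~0.382
    while b - a > tol:
        x1 = a + rho * (b - a)
        x2 = b - rho * (b - a)
        if f(x1) < f(x2):
            b = x2                      # minimizer lies in [a, x2]
        else:
            a = x1                      # minimizer lies in [x1, b]
    return (a + b) / 2

f = lambda x: x**2 + 4 * math.cos(x)    # x in radians
x_star = golden_section(f, 1.0, 2.0)    # roughly 1.8955
```

The result agrees with the first-order condition f'(x) = 2x − 4 sin x = 0, whose root on [1, 2] is about 1.8955.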
Solutions for HW Problems 4 (12)
12.3 Suppose that we perform an experiment to calculate the gravitational constant g as follows.
We drop a ball from a certain height, and measure its distance from the original point at certain
time instants
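The actual measurements are cut off above, so the (t_i, s_i) data below are hypothetical; the sketch only illustrates the least-squares estimate of g under the model s = (1/2) g t^2:

```python
# Hypothetical (t_i, s_i) measurements in seconds and meters -- NOT the
# textbook's data, which is cut off above.
data = [(0.1, 0.050), (0.2, 0.197), (0.3, 0.441), (0.4, 0.786), (0.5, 1.223)]

# Model: s = (1/2) g t^2.  Minimizing sum_i (s_i - (1/2) g t_i^2)^2 over the
# single unknown g gives the closed form  g* = 2 * sum(t^2 s) / sum(t^4).
num = sum(t**2 * s for t, s in data)
den = sum(t**4 for t, _ in data)
g_hat = 2 * num / den
```

With data this clean the estimate lands near the familiar 9.8 m/s^2.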
Chapter 12: Solving Linear Equations
Section 12.1 Least Squares Problems
Least squares: a special class of optimization problems.
Well studied because:
Many applications
Easy to solve
Basic idea: want to solve systems of linear equations
Ax = b,  A ∈ R^(m×n)
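When m > n the system is overdetermined and generally has no exact solution, so we minimize ||Ax − b||^2 instead. A minimal sketch with illustrative data (fitting a line through four points):

```python
import numpy as np

# Overdetermined A x = b with A in R^{m x n}, m > n: generally no exact
# solution, so we take the least-squares x minimizing ||A x - b||^2.
# (Illustrative data: fitting a line b ~ x0 + x1 * t through four points.)
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([6.0, 5.0, 7.0, 10.0])

x_ls, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
```

For this data the least-squares solution is intercept 3.5 and slope 1.4, the same x that the normal equations A^T A x = A^T b produce.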
MTH 4070-6070 Optimization Techniques
Professor Chao Cheng Huang
Chapter 6: Optimization: Set-Constrained or Unconstrained
Introduction (6.1)
Consider the optimization (minimization) problem:
minimize f(x)
subject to x ∈ Ω
For convenience, we shall drop the overhead arrow
Chapter 7: One-Dimensional Search Methods
Consider f : R → R.
Look for a local minimizer of f numerically.
Use an iterative algorithm to find the minimizer.
Start with an initial guess x(0). Next point
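The iterative idea can be sketched as follows: from an initial guess x0, each new point is generated from the current one, here by stepping against the derivative (the example f and the step size alpha are illustrative choices):

```python
# Sketch of the iterative idea: from an initial guess x0, generate the next
# point from the current one, here by stepping against the derivative.
# (The example f and step size alpha are illustrative.)
def iterate(df, x0, alpha=0.4, iters=50):
    x = x0
    for _ in range(iters):
        x = x - alpha * df(x)   # x(k+1) = x(k) - alpha * f'(x(k))
    return x

df = lambda x: 2 * (x - 2.0)    # f(x) = (x - 2)^2, so f'(x) = 2(x - 2)
x_min = iterate(df, x0=0.0)     # approaches the minimizer x = 2
```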
Chapter 11: Quasi-Newton Method
Section 11.1 Introduction
Newton's method:
Fast convergence if we start close enough to the solution.
Requires the Hessian inverse (which may be costly to compute for large problems).
Quasi-Newton methods: approximate the Hessian inverse using only
gradient information
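A minimal sketch of one such scheme, the BFGS update, on a small quadratic test problem (Q and b are illustrative; the exact-line-search step is valid only because the objective is quadratic):

```python
import numpy as np

# Sketch of one quasi-Newton scheme (the BFGS update) on a small quadratic
# f(x) = 0.5 x^T Q x - b^T x.  The true inverse Hessian is never formed;
# H is rebuilt from gradient differences alone.  Q, b are illustrative.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad = lambda x: Q @ x - b

x = np.zeros(2)
H = np.eye(2)                             # initial inverse-Hessian estimate
I = np.eye(2)
for _ in range(10):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    d = -H @ g                            # quasi-Newton direction
    alpha = -(g @ d) / (d @ Q @ d)        # exact line search (quadratic case)
    x_new = x + alpha * d
    s = x_new - x                         # step
    y = grad(x_new) - g                   # gradient change
    rho = 1.0 / (y @ s)
    H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
        + rho * np.outer(s, s)            # BFGS inverse-Hessian update
    x = x_new
```

Only gradients enter the update of H, yet the iterates reach the minimizer Q^{-1} b.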
Chapter 15: Linear programming (LP)
Constrained optimization problems
So far, we have considered unconstrained optimization problems:
minimize f (x)
where f : R^n → R. The solution x* can be anything in R^n.
We now turn to problems with constraints.
General constrained
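A brute-force illustration (not the simplex method) of the key LP fact that an optimum sits at a vertex of the feasible region; the example problem and all its data are illustrative:

```python
import itertools
import numpy as np

# Brute-force illustration (not the simplex method) that an LP optimum sits
# at a vertex of the feasible region.  Example problem (illustrative):
#   maximize 3 x1 + 5 x2   subject to   x1 <= 4,  2 x2 <= 12,
#   3 x1 + 2 x2 <= 18,  x1, x2 >= 0.
c = np.array([3.0, 5.0])
A = np.array([[1.0, 0.0], [0.0, 2.0], [3.0, 2.0]])
b = np.array([4.0, 12.0, 18.0])

rows = np.vstack([A, -np.eye(2)])           # include x >= 0 as -x <= 0
rhs = np.concatenate([b, np.zeros(2)])

best_x, best_val = None, -np.inf
for i, j in itertools.combinations(range(len(rows)), 2):
    M = rows[[i, j]]
    if abs(np.linalg.det(M)) < 1e-12:
        continue                            # parallel boundaries: no vertex
    v = np.linalg.solve(M, rhs[[i, j]])
    if np.all(rows @ v <= rhs + 1e-9):      # keep only feasible vertices
        if c @ v > best_val:
            best_x, best_val = v, c @ v
```

Enumerating boundary intersections like this is exponential in general; real LP solvers walk between vertices far more cleverly.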
Calculus Preliminaries
Chapter 3 Matrix
A linear transformation L : R^n → R^m satisfies (3.1):
L(x + y) = L(x) + L(y)
L(αx) = αL(x)
Any linear transformation L can be represented
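The representing matrix has columns L(e_j), the images of the standard basis vectors. A sketch with a hypothetical L (rotation of the plane by 90 degrees):

```python
import numpy as np

# The matrix of a linear transformation has columns L(e_j).  Hypothetical
# example: L rotates the plane by 90 degrees, L(x, y) = (-y, x).
def L(v):
    x, y = v
    return np.array([-y, x])

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
M = np.column_stack([L(e1), L(e2)])   # M = [[0, -1], [1, 0]]
```

By linearity, M v then reproduces L(v) for every v, not just the basis vectors.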
Chapter 8: Gradient Method
Introduction
Recall that we shall drop the overhead arrow: d(k) instead of ~d(k).
We now study multi-dimensional search for f : R^n → R.
Recall that the basic strategy
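A minimal steepest-descent sketch for a quadratic f(x) = 0.5 x^T Q x − b^T x; the step size is the exact minimizer along the negative gradient, which has a closed form in the quadratic case (Q, b, and the starting point are illustrative):

```python
import numpy as np

# Steepest-descent sketch for f(x) = 0.5 x^T Q x - b^T x.  The step size is
# the exact minimizer along -g, which is available in closed form for
# quadratics.  Q, b, and the starting point are illustrative.
Q = np.array([[3.0, 0.0], [0.0, 1.0]])
b = np.array([3.0, 1.0])

x = np.array([10.0, -5.0])
for _ in range(100):
    g = Q @ x - b                    # gradient at the current point
    if np.linalg.norm(g) < 1e-10:
        break
    alpha = (g @ g) / (g @ Q @ g)    # exact line-search step size
    x = x - alpha * g                # move against the gradient
```

The iterates zig-zag toward the solution of Q x = b, here (1, 1).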
Chapter 9: Newton's Method (M-D)
Recall that an M-D search includes two steps: (1) find the search direction d(k); and (2) run a 1-D search along this direction.
The gradient method uses −∇f(x(k)) as the search direction. So it uses
only gradient information (first derivative
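Newton's method also uses second-derivative (Hessian) information, so on a quadratic it lands on the minimizer in a single step. A sketch with illustrative Q and b:

```python
import numpy as np

# Newton's method uses curvature as well: x(k+1) = x(k) - F(x(k))^{-1} grad.
# On a quadratic f(x) = 0.5 x^T Q x - b^T x the Hessian is F = Q everywhere,
# so a single Newton step lands on the minimizer.  Q, b are illustrative.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad = lambda x: Q @ x - b

x0 = np.array([5.0, -7.0])
x1 = x0 - np.linalg.solve(Q, grad(x0))   # solve the system; never invert Q
```

Note the step solves a linear system rather than forming Q^{-1} explicitly, which is both cheaper and numerically safer.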
Chapter 10: Conjugate Direction Method
Recall that in the gradient method, we search for the minimizer along an
"orthogonal" path: x(k+1) − x(k) ⊥ x(k) − x(k−1), i.e. d(k) ⊥ d(k+1).
While in the modified
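The conjugate direction idea can be sketched with the conjugate gradient method on a quadratic: successive directions are made Q-conjugate rather than orthogonal, and in exact arithmetic the method terminates in at most n steps (Q, b below are illustrative):

```python
import numpy as np

# Conjugate gradient sketch for f(x) = 0.5 x^T Q x - b^T x: successive
# directions are Q-conjugate rather than orthogonal, and in exact arithmetic
# the method terminates in at most n steps.  Q, b are illustrative.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])

x = np.zeros(2)
g = Q @ x - b                              # gradient (residual)
d = -g                                     # first direction: steepest descent
for _ in range(2):                         # n = 2 steps suffice here
    alpha = -(g @ d) / (d @ Q @ d)         # exact line search along d
    x = x + alpha * d
    g_new = Q @ x - b
    beta = (g_new @ g_new) / (g @ g)       # Fletcher-Reeves coefficient
    d = -g_new + beta * d                  # next Q-conjugate direction
    g = g_new
```

After n = 2 steps x solves Q x = b exactly, which steepest descent only approaches asymptotically.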
Lecture 1: Introduction
Optimization = making the best decision
engineering design
manufacturing process
policy making
business management
portfolio management
and much more.
We assume
Solutions for HW Problems 3 (10-11)
10.1 Let Q be a real symmetric positive definite n × n matrix. Given an arbitrary set of linearly
independent vectors {p(0), . . . , p(n−1)} in R^n, the Gram-Schmidt procedure generates a set
of vectors {
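The construction in 10.1 is Gram-Schmidt with the Q-inner product <u, v> = u^T Q v, which turns the p(i) into Q-conjugate directions d(i). A sketch with an illustrative Q and p's (not the problem's data):

```python
import numpy as np

# Gram-Schmidt with the Q-inner product <u, v> = u^T Q v: from linearly
# independent p(0), ..., p(n-1) it produces Q-conjugate directions d(i).
# The matrix Q and vectors p below are illustrative, not the problem's data.
Q = np.array([[3.0, 0.0, 1.0],
              [0.0, 4.0, 2.0],
              [1.0, 2.0, 3.0]])            # symmetric positive definite
P = [np.array([1.0, 0.0, 0.0]),
     np.array([1.0, 1.0, 0.0]),
     np.array([1.0, 1.0, 1.0])]            # linearly independent

D = []
for p in P:
    d = p.copy()
    for d_prev in D:                       # subtract Q-projections onto
        d -= (p @ Q @ d_prev) / (d_prev @ Q @ d_prev) * d_prev  # earlier d's
    D.append(d)
```

The resulting D satisfies d(i)^T Q d(j) = 0 for i ≠ j, exactly the conjugacy the conjugate direction method needs.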
Solutions for HW Problems 1 (1-6)
1.5 Suppose you are shown four cards, laid out in a row. Each card has a letter on one side and
a number on the other. On the visible side of the cards are printed the symbols:
S   8   3   A
Determine which cards