Name: _____________________________
Heuristic Methods for Optimization, Final 2010 (Non-theory section)
Read and sign the pledge before beginning the exam:
Academic integrity is expected of all students of Cornell University at all times, whether in the presence or absence of members of the faculty. Understanding this, I declare that I shall not give, use, or receive unauthorized aid in this examination.
Signature: _________________________________
Show your work so the grader can read and understand what you have done. Clearly state any assumptions you make and give any equations you are using.
1. (25 points) Selecting algorithms & parameters
Below you are given several different one-dimensional functions defined on the domain [1, 1000]. Assume that you want to minimize COST(S) and that S_current is the current value of S in the search algorithm you are using to find a minimum value. S_new is the new value you are considering in the next iteration. Assume each algorithm has NMAX iterations. Assume also that the initial value S_o is selected randomly from a uniform distribution over the allowable range, but is the same S_o for all algorithms you are comparing.

In all three algorithms listed below, the algorithm is NOT prevented from returning to points S for which it has evaluated the cost function COST(S) in a previous iteration.
Random sampling (RS)
Greedy Stochastic search (GS)
Simulated Annealing (SA): The SA parameter to be decided is T_o (initial temperature), which you should pick to optimize algorithm performance. Assume that you have restricted T_o to be between 30 and 5000. Assume α = 0.99 and M = 1.
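Since M = 1, the temperature is lowered once per iteration; assuming the usual geometric schedule T ← αT, the temperature after k iterations is T_k = α^k · T_o. As a quick check of how far this cools over one run (plain arithmetic on the given parameters, sketched in Python):

    # Geometric cooling: alpha = 0.99, one cooling step per iteration (M = 1).
    # After NMAX = 100 iterations, T has fallen to only 0.99**100 ~= 0.366 of
    # its starting value; e.g., T_o = 1000 ends the run near T ~= 366.
    t = 1000.0
    for _ in range(100):
        t *= 0.99
    print(t)  # ~366.0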
The neighborhood function for GS and SA is defined as follows: Pick a random number Δs from a uniform distribution over [-25, 25]. Let S_new = max(1, S_current + Δs) if Δs is negative, or S_new = min(1000, S_current + Δs) if Δs is positive.
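As a minimal Python sketch of this neighborhood move (the function name neighbor is ours, not the exam's):

    import random

    def neighbor(s_current):
        # Draw the step uniformly from [-25, 25], as defined above.
        delta_s = random.uniform(-25, 25)
        # Clip the move to the allowable range [1, 1000].
        if delta_s < 0:
            return max(1, s_current + delta_s)
        else:
            return min(1000, s_current + delta_s)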
Part (a): Continuous problem with S a real number between 1 and 1000 (the distance between jagged peaks is around 125 on average).
i. (9 pts) Assume you are using T_o = 1000 for SA. How would you rank the three methods (from best to worst) for solving this minimization problem accurately with NMAX = 100? Base your judgment on which you think would be the best methods on average. Explain the reasons for your ranking.
The algorithms rank as follows: 1) Simulated Annealing, 2) Greedy Stochastic search, and 3) Random Sampling. SA accepts uphill moves, so it can escape local minima and has the best chance of reaching the global minimum within the iteration budget. GS will get stuck in a local minimum depending on where it starts, but it still has a higher probability of finding the global minimum than RS, which jumps around the entire search space at random.
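For concreteness, here is a minimal Python sketch of the three searches, using the neighbor move defined earlier. The exam does not spell out SA's acceptance rule, so the standard Metropolis probability exp(-ΔCOST/T) is assumed; all function names are ours:

    import math
    import random

    def random_sampling(cost, nmax):
        # RS: evaluate nmax uniform random points in [1, 1000]; keep the best.
        best_s = random.uniform(1, 1000)
        best_c = cost(best_s)
        for _ in range(nmax):
            s = random.uniform(1, 1000)
            c = cost(s)
            if c < best_c:
                best_s, best_c = s, c
        return best_s

    def greedy_stochastic(cost, s0, nmax):
        # GS: move to a random neighbor only if it improves the cost,
        # so the search cannot climb out of a local minimum.
        s, c = s0, cost(s0)
        for _ in range(nmax):
            s_new = neighbor(s)
            c_new = cost(s_new)
            if c_new < c:
                s, c = s_new, c_new
        return s

    def simulated_annealing(cost, s0, t0, nmax, alpha=0.99):
        # SA: always accept improvements; accept uphill moves with probability
        # exp(-(c_new - c) / T) (assumed Metropolis rule); cool once per
        # iteration since M = 1.
        s, c, t = s0, cost(s0), t0
        for _ in range(nmax):
            s_new = neighbor(s)
            c_new = cost(s_new)
            if c_new <= c or random.random() < math.exp((c - c_new) / t):
                s, c = s_new, c_new
            t *= alpha
        return s

These sketches return the final point; tracking the best point seen so far is a common refinement.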