Simplex gradients

It is possible to build a simplex gradient. Given n + 1 affinely independent points y^0, y^1, ..., y^n, let S be the matrix with columns y^i - y^0, i = 1, ..., n, and let delta(f) = (f(y^1) - f(y^0), ..., f(y^n) - f(y^0))^T. The simplex gradient at y^0 is

    grad_s f(y^0) = S^{-T} delta(f).

Audet and Vicente (SIOPT 2008), Unconstrained optimization

Simple Fact (what a simplex gradient is)

The simplex gradient (based on n + 1 affinely independent points) is the gradient of the corresponding linear interpolation model:

    f(y^0) + < grad_s f(y^0), y^i - y^0 >
        = f(y^0) + (S^{-T} delta(f))^T (S e_i)
        = f(y^0) + delta(f)_i
        = f(y^i).

Simplex derivatives are the derivatives of the corresponding polynomial models.

Linesearch methods

For instance, the implicit filtering method:
- Computes a simplex gradient (per iteration).
- The function evaluations can be computed in parallel.
- It can use regression with more than n + 1 points.
- Improves the simplex gradient by applying a quasi-Newton update.
- Performs a line search along the negative computed direction.
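A minimal numerical sketch of the two ideas above, assuming NumPy; the test function f, the forward-difference step h, and the line-search parameters are illustrative choices, not from the slides. The simplex gradient is obtained by solving S^T g = delta(f), and an implicit-filtering-style step then performs a backtracking line search along -g.

```python
import numpy as np

def simplex_gradient(f, y0, pts):
    """Simplex gradient from y0 and n affinely independent neighbors.

    Builds S with columns y^i - y^0 and delta(f)_i = f(y^i) - f(y^0),
    then solves S^T g = delta(f), i.e. g = S^{-T} delta(f).
    """
    S = np.column_stack([y - y0 for y in pts])
    delta = np.array([f(y) - f(y0) for y in pts])
    return np.linalg.solve(S.T, delta)

def backtracking_step(f, y, g, alpha=1.0, beta=0.5, max_iter=30):
    """One implicit-filtering-style step: line search along -g
    with a simple-decrease acceptance test."""
    fy = f(y)
    for _ in range(max_iter):
        trial = y - alpha * g
        if f(trial) < fy:       # accept the first step that decreases f
            return trial
        alpha *= beta           # otherwise shrink the step
    return y                    # no decrease found

# Hypothetical smooth test function: f(x) = x0^2 + 2*x1^2.
f = lambda x: x[0] ** 2 + 2.0 * x[1] ** 2
y0 = np.array([1.0, 1.0])
h = 1e-6
pts = [y0 + h * np.eye(2)[i] for i in range(2)]  # forward-difference simplex

g = simplex_gradient(f, y0, pts)   # close to the true gradient (2, 4) at y0
y_new = backtracking_step(f, y0, g)
```

Note that the function evaluations f(y^i) in `simplex_gradient` are independent of one another, which is what makes the parallel evaluation mentioned above possible.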
This document was uploaded on 10/30/2011 for the course FIN 3403 at University of Florida.