# Finance Notes_Part_17 - Simplex gradients

## Simplex gradients

It is possible to build a simplex gradient. Given n + 1 affinely independent points y, y^1, …, y^n, with S = [y^1 − y ⋯ y^n − y] and δ(f)_i = f(y^i) − f(y), the simplex gradient is

    ∇_s f(y) = S^{−⊤} δ(f).

Simple Fact (What is a simplex gradient)
The simplex gradient (based on n + 1 affinely independent points) is the gradient of the corresponding linear interpolation model:

    f(y) + ⟨∇_s f(y), y^i − y⟩ = f(y) + (S^{−⊤} δ(f))^⊤ (S e_i) = f(y) + δ(f)_i = f(y^i).

−→ Simplex derivatives are the derivatives of the polynomial models.

(Audet and Vicente, SIOPT 2008, Unconstrained optimization)

## Line-search methods

For instance, the implicit filtering method:

- Computes a simplex gradient (per iteration).
  −→ The function evaluations can be computed in parallel.
  −→ It can use regression with more than n + 1 points.
- Improves the simplex gradient by applying a quasi-Newton update.
- Performs a line search along the negative computed direction.
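The formula ∇_s f(y) = S^{−⊤} δ(f) amounts to solving the linear system S^⊤ g = δ(f). A minimal numpy sketch (the function name and the linear test function are illustrative, not from the slides):

```python
import numpy as np

def simplex_gradient(y, Y, f):
    """Simplex gradient from base point y and n affinely
    independent points Y (one point per row)."""
    S = (np.asarray(Y) - y).T                     # columns of S are y^i - y
    delta = np.array([f(yi) for yi in Y]) - f(y)  # delta(f)_i = f(y^i) - f(y)
    # S^{-T} delta(f) is the solution g of S^T g = delta(f)
    return np.linalg.solve(S.T, delta)

# On a linear function the linear interpolation model is exact,
# so the simplex gradient equals the true gradient:
f = lambda x: 3.0 * x[0] - 2.0 * x[1] + 1.0
y = np.array([0.0, 0.0])
Y = np.array([[1.0, 0.0], [0.0, 1.0]])
print(simplex_gradient(y, Y, f))  # → [ 3. -2.]
```

On a nonlinear f the result is only an approximate gradient, with error governed by the geometry (well-poisedness) of the sample set.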
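The implicit-filtering steps above can be sketched as one iteration of simplex-gradient descent with a backtracking line search. This is a simplified illustration on a coordinate simplex with spacing h; it omits the quasi-Newton update and the stencil-failure logic of the full method, and all names are illustrative:

```python
import numpy as np

def coordinate_simplex_gradient(f, x, h):
    """Simplex gradient on the coordinate simplex {x, x + h e_1, ..., x + h e_n}.
    Here S = h I, so S^{-T} delta(f) = delta(f) / h (forward differences)."""
    n = len(x)
    delta = np.array([f(x + h * np.eye(n)[i]) - f(x) for i in range(n)])
    return delta / h

def implicit_filtering_step(f, x, h, max_backtracks=20):
    """One simplified iteration: simplex gradient, then a backtracking line
    search along the negative computed direction (no quasi-Newton update)."""
    g = coordinate_simplex_gradient(f, x, h)
    fx, t = f(x), 1.0
    for _ in range(max_backtracks):
        if f(x - t * g) < fx:   # simple decrease test
            return x - t * g
        t *= 0.5
    return x  # line search failed; the full method would shrink h and retry
```

Note that the n function evaluations inside `coordinate_simplex_gradient` are independent, which is why the slides point out they can be computed in parallel.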