# Finance Notes_Part_2: Limitations of Derivative-Free Optimization (Audet and Vicente, SIOPT 2008)


## Limitations of derivative-free optimization

| iteration k | \|x_k - x*\| |
|---|---|
| 0 | 1.8284e+00 |
| 1 | 5.9099e-01 |
| 2 | 1.0976e-01 |
| 3 | 5.4283e-03 |
| 4 | 1.4654e-05 |
| 5 | 1.0737e-10 |
| 6 | 1.1102e-16 |

Newton methods converge quadratically (locally) but require first- and second-order derivatives (gradient and Hessian). (Audet and Vicente, SIOPT 2008, Introduction)
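The quadratic convergence pattern in the table above can be reproduced with a minimal Newton iteration. The slides do not show which function produced their numbers, so the test function below (`f(x) = exp(x) - 2x`, minimizer `x* = ln 2`) is an assumption for illustration only:

```python
import math

# Minimal sketch of Newton's method for 1-D minimization (hypothetical test
# problem, not the one behind the slides' table).
# Newton step: x_{k+1} = x_k - f'(x_k) / f''(x_k).

def newton_minimize(df, d2f, x0, iters=7):
    """Return the iterates of Newton's method applied to f'(x) = 0."""
    xs = [x0]
    for _ in range(iters):
        x = xs[-1]
        xs.append(x - df(x) / d2f(x))
    return xs

# Assumed smooth test function: f(x) = exp(x) - 2x, minimizer x* = ln 2.
df = lambda x: math.exp(x) - 2.0   # gradient f'
d2f = lambda x: math.exp(x)        # Hessian f''

x_star = math.log(2.0)
for k, x in enumerate(newton_minimize(df, d2f, x0=2.5)):
    print(f"{k}  {abs(x - x_star):.4e}")  # the error roughly squares each step
```

The point of the table is the cost of that speed: each step needs both `df` and `d2f`, which is exactly what a derivative-free setting cannot supply.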

Limitations of derivative-free optimization iteration k x k - x * k 0 3.0000e+000 1 2.0002e+000 2 6.4656e-001 . . . . . . 6 1.4633e-001 7 4.0389e-002 8 6.7861e-003 9 6.5550e-004 10 1.4943e-005 11 8.3747e-008 12 8.8528e-010 Quasi Newton (secant) methods converge superlinearly (locally) but require ﬁrst order derivatives

This preview has intentionally blurred sections. Sign up to view the full version.

View Full Document
This is the end of the preview. Sign up to access the rest of the document.

Unformatted text preview: (gradient) . Audet and Vicente (SIOPT 2008) Introduction 9/109 Limitations of derivative-free optimization In DFO convergence/stopping is typically slow (per function evaluation): Audet and Vicente (SIOPT 2008) Introduction 10/109 Pitfalls The objective function might not be continuous or even well deﬁned: Audet and Vicente (SIOPT 2008) Introduction 11/109 Pitfalls The objective function might not be continuous or even well deﬁned: Audet and Vicente (SIOPT 2008) Introduction 12/109...
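The superlinear secant behavior from the quasi-Newton table above can be sketched the same way: the Hessian is replaced by a finite difference of two gradient values, so only first derivatives are needed, at the price of more iterations than Newton. The test function is again an assumed example, not the one from the slides:

```python
import math

# Minimal sketch of a 1-D secant (quasi-Newton) iteration (hypothetical test
# problem). The second derivative is approximated from two gradient values:
# x_{k+1} = x_k - f'(x_k) * (x_k - x_{k-1}) / (f'(x_k) - f'(x_{k-1})).

def secant_minimize(df, x0, x1, iters=12):
    """Return the iterates of the secant method applied to f'(x) = 0."""
    xs = [x0, x1]
    for _ in range(iters):
        a, b = xs[-2], xs[-1]
        ga, gb = df(a), df(b)
        if gb == ga:          # denominator vanished: machine precision reached
            break
        xs.append(b - gb * (b - a) / (gb - ga))
    return xs

# Assumed test function f(x) = exp(x) - 2x, minimizer x* = ln 2.
df = lambda x: math.exp(x) - 2.0
x_star = math.log(2.0)
for k, x in enumerate(secant_minimize(df, x0=3.0, x1=2.5)):
    print(f"{k}  {abs(x - x_star):.4e}")  # superlinear decay, order about 1.618
```

Comparing the two sketches mirrors the two tables: Newton reaches machine precision in about 7 iterations, the secant variant needs roughly 12, and a fully derivative-free method is slower still per function evaluation.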
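The pitfall of an objective that is discontinuous or not even well defined is common when the "function" is really a simulation. A sketch of one standard remedy from the DFO literature, the extreme-barrier approach (map any failed or undefined evaluation to `+inf` so the method simply rejects the point); the blackbox below is entirely hypothetical:

```python
import math

# Hypothetical blackbox objective: undefined for x <= 0, and the underlying
# "simulation" fails to converge near x = 1.5, returning NaN.

def blackbox(x):
    if x <= 0.0:
        raise ValueError("simulation failed: log of non-positive input")
    if abs(x - 1.5) < 1e-2:
        return float("nan")        # solver did not converge
    return math.log(x) ** 2 + x    # elsewhere a perfectly smooth function

def safe_objective(x):
    """Extreme barrier: treat any failure or non-finite value as +inf."""
    try:
        fx = blackbox(x)
    except ValueError:
        return math.inf
    return fx if math.isfinite(fx) else math.inf

print(safe_objective(-1.0))  # inf  (objective undefined there)
print(safe_objective(1.5))   # inf  (simulation failure)
print(safe_objective(2.0))   # finite value, usable by the method
```

The barrier keeps a derivative-free method well defined on the whole domain without pretending the blackbox is smooth where it is not.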

## This document was uploaded on 10/30/2011 for the course FIN 3403 at University of Florida.
