CMPSC/MATH 451 Numerical Computations
Lecture 36, Nov 14, 2011
Prof. Kamesh Madduri

Class Overview
• Numerical differentiation
• Finite difference schemes
• Richardson extrapolation
• Slides from the textbook (Michael T. Heath, Scientific Computing) follow.

Numerical Differentiation

Differentiation is inherently sensitive: small perturbations in the data can cause large changes in the result. Differentiation is the inverse of integration, which is inherently stable because of its smoothing effect. For example, two functions can have very similar definite integrals yet very different derivatives.

Numerical Differentiation, continued

To approximate the derivative of a function whose values are known only at a discrete set of points, a good approach is to fit some smooth function to the given data and then differentiate the approximating function. If the given data are sufficiently smooth, interpolation may be appropriate; if the data are noisy, a smoothing approximating function, such as a least squares fit or smoothing spline, is more appropriate.
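The "fit, then differentiate" idea can be illustrated with a short sketch (not from the slides; it assumes NumPy is available and uses an ordinary least squares polynomial rather than a spline). Differentiating noisy samples directly amplifies the noise, while differentiating a smoothing fit does not:

```python
import numpy as np

# Noisy samples of sin(x) on [0, 2*pi]
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 50)
y = np.sin(x) + 0.01 * rng.standard_normal(x.size)

# Naive approach: finite differences applied directly to the noisy data
naive = np.gradient(y, x)

# Fit-then-differentiate: least squares polynomial fit (degree 9 here,
# an arbitrary illustrative choice), then differentiate the fitted polynomial
coeffs = np.polyfit(x, y, 9)
dfit = np.polyval(np.polyder(coeffs), x)

# Compare both against the true derivative cos(x)
err_naive = np.max(np.abs(naive - np.cos(x)))
err_fit = np.max(np.abs(dfit - np.cos(x)))
```

With this seed and noise level, the derivative of the smooth fit is markedly closer to cos(x) than the finite differences of the raw noisy data.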
Finite Difference Approximations

Given a smooth function f : R → R, we wish to approximate its first and second derivatives at a point x. Consider the Taylor series expansions

f(x + h) = f(x) + f'(x) h + (f''(x)/2) h^2 + (f'''(x)/6) h^3 + ···

f(x − h) = f(x) − f'(x) h + (f''(x)/2) h^2 − (f'''(x)/6) h^3 + ···

Solving for f'(x) in the first series, we obtain the forward difference approximation

f'(x) = (f(x + h) − f(x))/h − (f''(x)/2) h − ··· ≈ (f(x + h) − f(x))/h

which is first-order accurate, since the dominant term in the remainder of the series is O(h).
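First-order accuracy means the error shrinks in proportion to h: halving h roughly halves the error, and shrinking h tenfold shrinks the error roughly tenfold. A minimal sketch (illustrative, not from the slides), using f = sin with known derivative cos:

```python
import math

def forward_diff(f, x, h):
    """Forward difference approximation to f'(x): (f(x+h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h

x = 1.0
exact = math.cos(x)  # true derivative of sin at x

# Errors for h = 1e-1, 1e-2, 1e-3; each should be roughly 10x smaller
# than the previous one, confirming O(h) accuracy
errs = [abs(forward_diff(math.sin, x, h) - exact) for h in (1e-1, 1e-2, 1e-3)]
```

The leading error term is (f''(x)/2) h, so the ratio of consecutive errors is close to the ratio of the step sizes, here about 10.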
This note was uploaded on 01/19/2012 for the course CMPSC 451 taught by Professor Staff during the Spring '08 term at Pennsylvania State University, University Park.