p103_7_S10

EE103 Spring 2010 Lecture Notes (SEJ), Section 7

SECTION 7: INTRODUCTION TO OPTIMIZATION: NONLINEAR LEAST SQUARES, THE NEWTON AND GAUSS-NEWTON METHODS

Contents:
  NEWTON'S METHOD FOR SQUARE SYSTEMS OF NONLINEAR EQUATIONS
    Newton's Algorithm
  NON-SQUARE SYSTEMS (m > n): THE GAUSS-NEWTON METHOD
    Gauss-Newton Algorithm
  INTRODUCTION TO OPTIMIZATION AND NONLINEAR LEAST SQUARES
  PHILOSOPHY OF UNCONSTRAINED NONLINEAR OPTIMIZATION
  THE NEWTON METHOD FOR OPTIMIZATION
    Numerical Example of Steepest Descent vs. Newton's Method
    Newton's Method Applied to Nonlinear Least Squares
    A Modification: The Gauss-Newton Method Revisited
    Summary of Optimization Methods
      Steepest Descent Algorithm (general idea)
      Newton Optimization Algorithm (general idea)
      Tutorial Codes (i.e., for classroom purposes)

© Copyright Stephen E Jacobsen, 2010
SECTION 7: INTRODUCTION TO OPTIMIZATION: NONLINEAR LEAST SQUARES, THE NEWTON AND GAUSS-NEWTON METHODS

This Section introduces the student to Newton's method for solving a system of nonlinear equations, where the number of variables is equal to the number of equations. Subsequently, we consider the case where there are more nonlinear equations than there are variables. As in the linear case ($Ax \approx b$, $m > n$), this leads us to consider the nonlinear least squares problem. The latter problem often arises when considering the estimation of parameters for known functional forms.

Newton's Method for Square Systems of Nonlinear Equations

We are interested in a numerical method for computing a solution of the system of nonlinear equations

$$
\begin{aligned}
f_1(x_1, x_2, \ldots, x_n) &= 0 \\
f_2(x_1, x_2, \ldots, x_n) &= 0 \\
&\;\;\vdots \\
f_n(x_1, x_2, \ldots, x_n) &= 0
\end{aligned}
$$

Let $x = (x_1, \ldots, x_n)^T$ and let $f = (f_1, \ldots, f_n)^T$. We have $f(x) = (f_1(x), f_2(x), \ldots, f_n(x))^T$, and we abbreviate the above system, using vector notation, to

$$
f(x) = 0,
$$

an $n \times n$ system of nonlinear equations. Given a point $x^k = (x_1^k, \ldots, x_n^k)^T$, we use the 1st order approximation for each of the functions; that is,

$$
\begin{aligned}
f_1(x) &\approx f_1(x^k) + \nabla f_1(x^k)^T (x - x^k) \\
f_2(x) &\approx f_2(x^k) + \nabla f_2(x^k)^T (x - x^k) \\
&\;\;\vdots \\
f_n(x) &\approx f_n(x^k) + \nabla f_n(x^k)^T (x - x^k)
\end{aligned}
$$

or, in vector-matrix notation:
$$
f(x) \;\approx\; f(x^k) +
\begin{bmatrix}
\nabla f_1(x^k)^T \\
\nabla f_2(x^k)^T \\
\vdots \\
\nabla f_n(x^k)^T
\end{bmatrix}
(x - x^k)
\;=\; f(x^k) + J_f(x^k)\,(x - x^k)
$$

As in the single variable case, we solve the (system of) linear equations

$$
f(x^k) + J_f(x^k)\,(x - x^k) = 0
$$

or, assuming $J_f(x^k)$ is nonsingular, we write (notation, not computation!)

$$
x^{k+1} = x^k - J_f(x^k)^{-1} f(x^k),
$$

that is,

$$
x^{k+1} = g_N(x^k), \qquad g_N(x) = x - J_f(x)^{-1} f(x)
$$

(i.e., we may think of this method as a fixed point method).
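To make the iteration concrete, here is a minimal Python sketch of Newton's method for a square nonlinear system. The particular test system, its Jacobian, the tolerance, and the iteration cap are illustrative choices, not part of the notes; in keeping with the "notation, not computation!" remark above, each step solves the linear system $J_f(x^k)\,\Delta x = -f(x^k)$ rather than forming the inverse.

```python
import numpy as np

def newton_system(f, J, x0, tol=1e-10, max_iter=50):
    """Newton's method for a square system f(x) = 0.

    At each step, solve J_f(x^k) dx = -f(x^k) for the Newton step dx
    (never forming the inverse) and update x^{k+1} = x^k + dx.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        fx = f(x)
        if np.linalg.norm(fx, np.inf) < tol:
            return x, k
        dx = np.linalg.solve(J(x), -fx)   # assumes J_f(x^k) is nonsingular
        x = x + dx
    return x, max_iter

# Illustrative 2x2 system (not from the notes):
#   f1(x1, x2) = x1^2 + x2^2 - 4 = 0
#   f2(x1, x2) = x1 * x2 - 1     = 0
f = lambda x: np.array([x[0]**2 + x[1]**2 - 4.0, x[0]*x[1] - 1.0])
J = lambda x: np.array([[2.0*x[0], 2.0*x[1]],
                        [x[1],     x[0]]])

root, iters = newton_system(f, J, x0=[2.0, 0.5])
print(root, iters)   # converges to a solution of the system in a few steps
```

The loop is exactly the fixed point iteration $x^{k+1} = g_N(x^k)$ described above, with the matrix inverse replaced by a linear solve.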