
Chapter 4
The Gradient Method

4.1 Descent Directions Methods

In this chapter we consider the unconstrained minimization problem

    $\min\{ f(\mathbf{x}) : \mathbf{x} \in \mathbb{R}^n \}.$

We assume that the objective function is continuously differentiable over $\mathbb{R}^n$. We have already seen in Chapter 2 that a first order necessary optimality condition is that the gradient vanishes at optimal points, so in principle the optimal solution of the problem can be obtained by finding, among all the stationary points of $f$, the one with the minimal function value. In Chapter 2 several examples were presented in which such an approach can lead to the detection of the unconstrained global minimum of $f$, but unfortunately these were exceptional examples. In the majority of problems such an approach is not implementable for the following reasons: (i) it might be a very difficult task to solve the set of (usually nonlinear) equations $\nabla f(\mathbf{x}) = \mathbf{0}$; (ii) even if it is possible to find all the stationary points, there might be an infinite number of them, and the task of finding the one corresponding to the minimal function value is an optimization problem which by itself might be as difficult as the original problem. For these reasons, instead of trying to find an analytic solution to the stationarity condition, we will consider an iterative algorithm for finding stationary points.

The iterative algorithms that we will consider in this chapter take the form

    $\mathbf{x}_{k+1} = \mathbf{x}_k + t_k \mathbf{d}_k, \quad k = 0, 1, 2, \ldots,$

where $\mathbf{d}_k$ is the so-called direction and $t_k$ is the stepsize. We will limit ourselves to "descent directions," whose definition is now given.

Definition 4.1 (descent direction). Let $f : \mathbb{R}^n \to \mathbb{R}$ be a continuously differentiable function over $\mathbb{R}^n$. A vector $\mathbf{0} \neq \mathbf{d} \in \mathbb{R}^n$ is called a descent direction of $f$ at $\mathbf{x}$ if the directional derivative $f'(\mathbf{x}; \mathbf{d})$ is negative, meaning that

    $f'(\mathbf{x}; \mathbf{d}) = \nabla f(\mathbf{x})^T \mathbf{d} < 0.$

The most important property of descent directions is that taking small enough steps along these directions leads to a decrease of the objective function.
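As a concrete illustration of Definition 4.1 (not part of the original text), the sketch below checks numerically whether a vector is a descent direction by testing the sign of $\nabla f(\mathbf{x})^T \mathbf{d}$. The quadratic objective `f`, its gradient `grad_f`, and the test points are hypothetical examples chosen only for illustration.

```python
import numpy as np

# Hypothetical example: f(x) = x_1^2 + 2*x_2^2, so grad f(x) = (2*x_1, 4*x_2).
def f(x):
    return x[0] ** 2 + 2.0 * x[1] ** 2

def grad_f(x):
    return np.array([2.0 * x[0], 4.0 * x[1]])

def is_descent_direction(grad, x, d):
    """Definition 4.1: a nonzero d is a descent direction of f at x
    if the directional derivative f'(x; d) = grad f(x)^T d is negative."""
    return float(grad(x) @ d) < 0.0

x = np.array([1.0, 1.0])
# -grad f(x) gives grad f(x)^T d = -||grad f(x)||^2 < 0, hence a descent direction.
print(is_descent_direction(grad_f, x, -grad_f(x)))            # True
# d = (1, 0) gives grad f(x)^T d = 2 > 0, so it is not a descent direction here.
print(is_descent_direction(grad_f, x, np.array([1.0, 0.0])))  # False
```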
Lemma 4.2 (descent property of descent directions). Let $f$ be a continuously differentiable function over $\mathbb{R}^n$, and let $\mathbf{x} \in \mathbb{R}^n$. Suppose that $\mathbf{d}$ is a descent direction of $f$ at $\mathbf{x}$. Then there exists $\epsilon > 0$ such that

    $f(\mathbf{x} + t\mathbf{d}) < f(\mathbf{x})$

for any $t \in (0, \epsilon]$.

Proof. Since $f'(\mathbf{x}; \mathbf{d}) < 0$, it follows from the definition of the directional derivative that

    $\lim_{t \to 0^+} \frac{f(\mathbf{x} + t\mathbf{d}) - f(\mathbf{x})}{t} = f'(\mathbf{x}; \mathbf{d}) < 0.$

Therefore, there exists an $\epsilon > 0$ such that

    $\frac{f(\mathbf{x} + t\mathbf{d}) - f(\mathbf{x})}{t} < 0$

for any $t \in (0, \epsilon]$, which readily implies the desired result.

We are now ready to write in a schematic way a general descent directions method.

Schematic Descent Directions Method

Initialization: Pick $\mathbf{x}_0 \in \mathbb{R}^n$ arbitrarily.
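The schematic method is cut off after its initialization step in this excerpt, so the following is only a minimal sketch of a plausible iteration loop built from the update $\mathbf{x}_{k+1} = \mathbf{x}_k + t_k \mathbf{d}_k$ stated above. The specific choices here are assumptions, not the text's prescription: the direction is taken as $-\nabla f(\mathbf{x}_k)$, the stepsize is found by halving until the function value decreases (such a stepsize exists by Lemma 4.2), and the stopping rule and tolerances are illustrative.

```python
import numpy as np

def descent_method(f, grad_f, x0, tol=1e-6, max_iter=1000):
    """Generic descent-directions scheme x_{k+1} = x_k + t_k * d_k.

    Assumed choices (not fixed by the text shown here):
    d_k = -grad f(x_k), t_k found by halving until f decreases,
    stop when ||grad f(x_k)|| <= tol or after max_iter iterations.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) <= tol:       # approximate stationarity reached
            break
        d = -g                             # a descent direction: grad f(x)^T d = -||g||^2 < 0
        t = 1.0
        # Lemma 4.2 guarantees some t in (0, eps] with f(x + t d) < f(x);
        # halve t until that happens (with a floor to avoid an endless loop).
        while f(x + t * d) >= f(x) and t > 1e-12:
            t *= 0.5
        x = x + t * d                      # the update x_{k+1} = x_k + t_k d_k
    return x

# Usage with the hypothetical quadratic from the previous sketch:
# x_star = descent_method(f, grad_f, np.array([1.0, 1.0]))
```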
