hw4sol: EGM6365 Homework #4 Solutions


1. Write a Matlab script to solve the following unconstrained optimization problem using Newton's method:

   Minimize f(x) = 1 + 2(x + 1)^2 - x^3 + e^x

Use the initial points x = 6 and x = -6. The iteration should stop when the magnitude of the function gradient is less than 10^-6. In the Matlab program, plot the curve of the function over x = [-6, 6] using a solid line. At each iteration, plot the function value as a dot so that we can see how the algorithm finds the optimum point. Submit your Matlab script, a table of the iteration history ( x(k), Δx(k), f(x(k)), ∇f(x(k)), H(x(k)) ) for each iteration k, and the plot.

Figure. Iteration history when x(0) = 6. (Plot omitted; each dot marks one Newton iterate on the curve of f.)

x(0) = 6
Iter   x         dx        f          df         H
  1    5.1292   -0.8708   110.0771   323.4288   371.4288
  2    4.3237   -0.8055    52.3215   114.4770   142.1118
  3    3.5637   -0.7600    32.6893    40.6784    53.5243
  4    2.7012   -0.8625    23.5860    15.4483    17.9111
  5   -0.2029   -2.9041     3.0954     7.8129     2.6903
  6   -0.8462   -0.6432     2.0822     3.8812     6.0338
  7   -0.7301    0.1161     2.0167    -1.1035     9.5060
  8   -0.7258    0.0042     2.0166    -0.0374     8.8623
  9   -0.7258    0.0000     2.0166    -0.0000     8.8390
 10   -0.7258    0.0000     2.0166    -0.0000     8.8390

x(0) = -6
Iter   x         dx        f          df          H
  1   -2.8003    3.1997    29.5008   -127.9975   40.0025
  2   -1.3304    1.4699     3.8375    -30.6646   20.8624
  3   -0.8105    0.5199     2.0489     -6.3672   12.2468
  4   -0.7280    0.0825     2.0167     -0.7681    9.3076
  5   -0.7258    0.0021     2.0166     -0.0189    8.8507
  6   -0.7258    0.0000     2.0166     -0.0000    8.8390
  7   -0.7258    0.0000     2.0166     -0.0000    8.8390

Both cases converged to the same solution, x* = -0.7258.
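The tables above can be cross-checked outside Matlab with the same recursion, x(k+1) = x(k) - f'(x(k))/f''(x(k)). A minimal sketch in Python (for verification only; the graded deliverable is the Matlab script below, and like that script this stops on the step size |dx| rather than the gradient):

```python
import math

def newton(x, tol=1e-6, max_iter=50):
    """Plain Newton's method on f'(x) = 0 for f(x) = 1 + 2(x+1)^2 - x^3 + e^x."""
    for _ in range(max_iter):
        df  = 4*(x + 1) - 3*x**2 + math.exp(x)   # f'(x)
        d2f = 4 - 6*x + math.exp(x)              # f''(x)
        dx = -df / d2f                           # Newton step
        x += dx
        if abs(dx) < tol:                        # same stopping rule as the script
            break
    return x

for x0 in (6.0, -6.0):
    x = newton(x0)
    f = 1 + 2*(x + 1)**2 - x**3 + math.exp(x)
    print(f"x0 = {x0:+.0f}: x* = {x:.4f}, f(x*) = {f:.4f}")
```

Both starting points reproduce the tabulated optimum x* = -0.7258 with f(x*) = 2.0166.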
% Newton's line search method.
% That is, one-dimensional unconstrained minimization using
% Newton's method (or the Newton-Raphson method for finding
% the root of df/dx = 0).
clear all
clc
clf
axis normal

eps = 1.E-6;

% Initial guess
x(1) = 6;

% Step-size factor for the modified Newton's method.
% alpha = 1 is the regular Newton's method.
alpha = 1;

% Plot the function first
px = -6:0.01:6;
pf = 1 + 2*(px+1).^2 - px.^3 + exp(px);
plot(px,pf,'-r');
hold on

% Begin Newton's method
% Initialize the iteration number to 1
i = 1;

% n in the for loop below sets the maximum number of iterations
for n = 1:50
    % Compute the first derivative at the current x
    df = 4*(x(i)+1) - 3*x(i)^2 + exp(x(i));
    % Compute the second derivative at the current x
    d2f = 4 - 6*x(i) + exp(x(i));
    % Increase the iteration number
    i = i + 1;
    % Update x using the (modified) Newton's method
    delx = -alpha*df/d2f;
    x(i) = x(i-1) + delx;
    % Compute f(x) for plotting purposes
    f = 1 + 2*(x(i)+1)^2 - x(i)^3 + exp(x(i));
    % Plot the new iterate
    plot(x(i),f,'b.');
    fprintf(1,'\nIter=%2d x=%8.4f dx=%8.4f f=%8.4f df=%8.4f H=%8.4f', ...
        i-1, x(i), delx, f, df, d2f);
    % Pause so that you can see in the plot how the
    % algorithm is (not) making progress
    pause
    % Convergence check
    if abs(delx) < eps
        break;
    end
end
hold off

2. Try to solve the following unconstrained optimization problem using Newton's method:

   Minimize f(x) = 1 + 2x^2 - x^3 + e^x

Discuss any difficulty and explain.
x(0) = -6
Iter   x         dx        f          df          H
  1   -2.7003    3.2997    35.3389   -131.9975   40.0025
  2   -1.0915    1.6088     5.0187    -32.6082   20.2688
  3   -0.3929    0.6986     2.0444     -7.6042   10.8846
  4   -0.1996    0.1933     1.9067     -1.3593    7.0323
  5   -0.1832    0.0164     1.9059     -0.0986    6.0165
  6   -0.1831    0.0001     1.9059     -0.0007    5.9316
  7   -0.1831    0.0000     1.9059     -0.0000    5.9310

x(0) = 6
Iter   x          dx        f          df         H
  1    5.1400    -0.8600    88.7582   319.4288   371.4288
  2    4.3614    -0.7786    34.4499   112.0170   143.8758
  3    3.6720    -0.6895    17.7854    38.7488    56.2007
  4    3.0349    -0.6370    12.2670    13.5669    21.2972
  5    2.2296    -0.8053     9.1549     5.3069     6.5900
  6   42.8281    40.5984   3.98E19      3.3012    -0.0813
  7   41.8281    -1.0000   1.46E19     3.98E18    3.98E18

Starting from x(0) = -6 the iteration converged to x* = -0.1831, but starting from x(0) = 6 it diverged. At iteration 6 the second derivative is negative and small in magnitude (H = -0.0813 at x = 2.2296), so the Newton step Δx = -f'/f'' is huge and points uphill, throwing the iterate to x = 42.8281, where the exponential term makes f and its derivatives astronomically large. Newton's method implicitly assumes positive curvature; wherever f''(x) < 0 the step is directed toward a maximum rather than a minimum.

3. Solve Problem 2 using the "fminunc" function in Matlab. Submit all iteration information.
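The divergence in Problem 2 can be confirmed by checking the sign of the second derivative directly. For f(x) = 1 + 2x^2 - x^3 + e^x, f''(x) = 4 - 6x + e^x dips below zero on an interval of the positive axis, which is exactly where the x(0) = 6 iteration wandered. A quick Python check (for illustration, independent of the Matlab run):

```python
import math

# f''(x) for f(x) = 1 + 2x^2 - x^3 + e^x
d2f = lambda x: 4 - 6*x + math.exp(x)

# Near the iterate x = 2.2296 from the table, f'' is small and negative
# (the table shows H = -0.0813 there), so dx = -f'/f'' is large and uphill.
print(d2f(2.2296))

# At the true minimizer x* = -0.1831 the curvature is healthy and positive.
print(d2f(-0.1831))
```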
option = optimset('GradObj','on','Hessian','on','Display','iter');
x0 = -6;
[x,fval,exitflag,output,grad,hessian] = fminunc('myfun',x0,option)

function [f,g,h] = myfun(x)
f = 1 + 2*x^2 - x^3 + exp(x);
if nargout > 1
    g = 4*x - 3*x^2 + exp(x);
    if nargout > 2
        h = 4 - 6*x + exp(x);
    end
end

x(0) = 6
                                 Norm of       First-order
 Iteration      f(x)             step          optimality    CG-iterations
     1         260.429               1             319              0
     2         88.7582            0.86             112              1
     3         34.4499        0.778567            38.7              1
     4         17.7854        0.689471            13.6              1
     5          12.267        0.637028            5.31              1
     6         9.15492        0.805291             3.3              1
     7         9.15492              10             3.3              1
     8         1.92905             2.5           0.538              0
     9          1.9059       0.0841946          0.0185              1
    10         1.90587      0.00310805         2.5e-05              1
    11         1.90587      4.2094e-06        4.58e-11              1
Optimization terminated successfully:
 Relative function value changing by less than OPTIONS.TolFun

x = -0.1831
fval = 1.9059
exitflag = 1
output =
       iterations: 11
        funcCount: 11
     cgiterations: 9
    firstorderopt: 4.5780e-11
        algorithm: 'large-scale: trust-region Newton'
grad = -4.5780e-11
hessian = 5.9310

x(0) = -6
                                 Norm of       First-order
 Iteration      f(x)             step          optimality    CG-iterations
     1         289.002               1             132              0
     2         35.3389         3.29973            32.6              1
     3         5.01867         1.60879             7.6              1
     4         2.04444        0.698617            1.36              1
     5         1.90669        0.193299          0.0986              1
     6         1.90587        0.016393        0.000696              1
     7         1.90587     0.000117257        3.55e-08              1
Optimization terminated successfully:
 Relative function value changing by less than OPTIONS.TolFun

x = -0.1831
fval = 1.9059
exitflag = 1
output =
       iterations: 7
        funcCount: 7
     cgiterations: 6
    firstorderopt: 3.5523e-08
        algorithm: 'large-scale: trust-region Newton'
grad = -3.5523e-08
hessian = 5.9310
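fminunc converges even from x(0) = 6 because, as its output reports, the large-scale algorithm is a trust-region Newton method: the step is restrained when the local curvature is untrustworthy. The sketch below captures that idea in pure Python; it is not MathWorks' actual algorithm, and the two safeguards (falling back to a gradient step when f'' <= 0, and halving the step until f decreases) are simplifications chosen for illustration:

```python
import math

def f(x):   return 1 + 2*x**2 - x**3 + math.exp(x)
def df(x):  return 4*x - 3*x**2 + math.exp(x)
def d2f(x): return 4 - 6*x + math.exp(x)

def safeguarded_newton(x, tol=1e-8, max_iter=100):
    """Newton's method with two safeguards: use a gradient step when the
    curvature is non-positive, and backtrack until f actually decreases."""
    for _ in range(max_iter):
        g, h = df(x), d2f(x)
        if abs(g) < tol:
            break
        dx = -g / h if h > 0 else -g    # fall back to steepest descent
        while f(x + dx) > f(x):         # crude backtracking line search
            dx *= 0.5
        x += dx
    return x

print(safeguarded_newton(6.0))   # reaches the minimizer near -0.1831
```

With the safeguards, the iteration from x(0) = 6 no longer shoots off to x = 42.8 when it passes through the negative-curvature region near x = 2.2, mirroring how the trust-region iterations in the fminunc output stay bounded.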
