Cody

Problem 485. Fletcher-Reeves Conjugate Gradient Method

Solution 479401

Submitted on 28 Jul 2014 by Goichi

Test Suite

Test Status Code Input and Output
1   Pass

%%
% Rosenbrock's banana function
F = @(x) 100*(x(2)-x(1).^2).^2 + (1-x(1)).^2;
gradF = @(x) [100*(4*x(1).^3-4*x(1).*x(2))+2*x(1)-2; 100*(2*x(2)-2*x(1).^2)];
x0 = [-1.9; 2.0];
x1 = [-1.4478 2.1184];
x2 = [1.7064 2.9446];
f1 = 6.0419;
f2 = 0.6068;
[xmin,fmin] = ConjGrad(F,gradF,x0,0.01,1)  % single steepest descent
assert(norm(xmin-x1)<0.2 || norm(xmin-x2)<0.2)
assert(abs(fmin-f1)<0.5 || abs(fmin-f2)<0.5)  % 2 local min

iter  alpha  f(alpha)    norm(c)
   0  0.000  267.6200  1270.8691
beta = 1.2321e-04
   1  0.000    6.0719    14.1065
xmin = -1.4452  2.1191
fmin = 6.0719
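Every test defines the banana function and its analytic gradient inline, so it is worth confirming that gradF really is the gradient of F. The sketch below does that with a central finite-difference check; it is written in Python/NumPy purely for illustration and is not part of the locked Cody solution (the helper `fd_grad` and the step size `h` are my own choices):

```python
import numpy as np

def rosenbrock(x):
    # Rosenbrock's banana function, as used throughout the test suite
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    # Analytic gradient, same expressions as gradF in the tests
    return np.array([
        100.0 * (4*x[0]**3 - 4*x[0]*x[1]) + 2*x[0] - 2,
        100.0 * (2*x[1] - 2*x[0]**2),
    ])

def fd_grad(F, x, h=1e-6):
    # Central finite-difference approximation of the gradient
    g = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (F(x + e) - F(x - e)) / (2.0 * h)
    return g

x0 = np.array([-1.9, 2.0])
print(np.linalg.norm(rosenbrock_grad(x0)))  # ~1270.8691, the norm(c) at iter 0 above
print(np.max(np.abs(rosenbrock_grad(x0) - fd_grad(rosenbrock, x0))))
```

The gradient norm at the starting point agrees with the norm(c) column of the iteration log, which confirms that the test's gradF expressions (written via the chain rule rather than the textbook -400x(y-x^2) form) are equivalent.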

2   Pass

%%
% Rosenbrock's banana function
F = @(x) 100*(x(2)-x(1).^2).^2 + (1-x(1)).^2;
gradF = @(x) [100*(4*x(1).^3-4*x(1).*x(2))+2*x(1)-2; 100*(2*x(2)-2*x(1).^2)];
x0 = [0; 0];
xcorrect = [0.2926 0.0505];
fcorrect = 0.6238;
[xmin,fmin] = ConjGrad(F,gradF,x0,1e-2,2)  % two iterations
assert(norm(xmin-xcorrect)<0.1)
assert(abs(fmin-fcorrect)<0.01)

iter  alpha  f(alpha)  norm(c)
   0  0.000    1.0000   2.0000
beta = 6.7653
beta = 2.0921
   2  0.010    0.6238   7.5242
xmin = 0.2926  0.0505
fmin = 0.6238

3   Pass

%%
% Rosenbrock's banana function
F = @(x) 100*(x(2)-x(1).^2).^2 + (1-x(1)).^2;
gradF = @(x) [100*(4*x(1).^3-4*x(1).*x(2))+2*x(1)-2; 100*(2*x(2)-2*x(1).^2)];
x0 = [1.1; 0.9];
xcorrect = [1; 1];
fcorrect = 0;
[xmin,fmin] = ConjGrad(F,gradF,x0)  % default 20 iterations
assert(norm(xmin-xcorrect)<0.1)
assert(abs(fmin-fcorrect)<0.01);

iter  alpha  f(alpha)   norm(c)
   0  0.000    9.6200  150.0119
beta = 9.4900e-06
beta = 0.0020
beta = 88.4720
beta = 0.0026
   4  0.001    0.0000    0.0099
xmin = 0.9991  0.9982
fmin = 8.1952e-07

4   Pass

%%
% Rosenbrock's banana function
F = @(x) 100*(x(2)-x(1).^2).^2 + (1-x(1)).^2;
gradF = @(x) [100*(4*x(1).^3-4*x(1).*x(2))+2*x(1)-2; 100*(2*x(2)-2*x(1).^2)];
x0 = [0; 0];
xcorrect = [1; 1];
fcorrect = 0;
[xmin,fmin] = ConjGrad(F,gradF,x0,0.01,100)  % convergence before 100 iterations
assert(norm(xmin-xcorrect)<0.1)
assert(abs(fmin-fcorrect)<0.01);

iter  alpha  f(alpha)  norm(c)
   0  0.000    1.0000   2.0000
beta = 6.7653
beta = 2.0921
beta = 1.4972
beta = 1.0691
beta = 0.1161
beta = 0.0123
beta = 0.3013
beta = 22.9734
beta = 4.4030
beta = 0.3518
beta = 0.0030
beta = 0.0218
  12  0.001    0.0001   0.0096
xmin = 1.0106  1.0214
fmin = 1.1304e-04

5   Pass

%%
% Rosenbrock's banana function
F = @(x) 100*(x(2)-x(1).^2).^2 + (1-x(1)).^2;
gradF = @(x) [100*(4*x(1).^3-4*x(1).*x(2))+2*x(1)-2; 100*(2*x(2)-2*x(1).^2)];
x0 = [-1.9; 2];
xcorrect = [1; 1];
fcorrect = 0;
[xmin,fmin] = ConjGrad(F,gradF,x0,1e-3,200)
assert(isequal(round(xmin),xcorrect))
assert(isequal(round(fmin),fcorrect))

iter  alpha  f(alpha)    norm(c)
   0  0.000  267.6200  1270.8691
beta = 1.2321e-04  beta = 0.0130   beta = 318.9298  beta = 1.6067  beta = 1.1694
beta = 1.0779      beta = 1.0160   beta = 0.9865    beta = 0.9733  beta = 0.9550
beta = 0.9810      beta = 0.9496   beta = 0.9635    beta = 0.9388  beta = 0.9759
beta = 0.9333      beta = 0.9773   beta = 0.9683    beta = 0.9283  beta = 0.9807
beta = 0.9753      beta = 0.9714   beta = 0.9690    beta = 0.9564  beta = 0.9839
beta = 0.9897      beta = 0.9882   beta = 1.0027    beta = 1.0012  beta = 1.0004
beta = 1.0001      beta = 1.0364   beta = 1.0130    beta = 1.0195  beta = 1.0206
beta = 1.0222      beta = 1.0243   beta = 1.0268    beta = 1.0700  beta = 1.0302
beta = 1.1075      beta = 1.0240   beta = 1.0338    beta = 1.1168  beta = 1.0255
beta = 1.0632      beta = 1.0357   beta = 1.1171    beta = 1.0481  beta = 1.0390
beta = 1.1253      beta = 1.0516   beta = 1.0377    beta = 1.0480  beta = 1.0442
beta = 1.0356      beta = 1.0191   beta = 1.0389    beta = 0.9948  beta = 0.9060
beta = 0.8856      beta = 0.6974   beta = 0.2243    beta = 0.0389  beta = 0.0302
beta = 0.2711      beta = 6.5943   beta = 39.8013   beta = 1.7119  beta = 1.6830
beta = 0.9358      beta = 0.6490   beta = 0.2701    beta = 0.0061  beta = 0.0092
beta = 0.9285      beta = 91.1315  beta = 0.0351    beta = 6.8319e-04  beta = 0.1599
  80  0.001    0.0000      0.0005
xmin = 1.0005  1.0011
fmin = 3.0282e-07
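Taken together, the logs show the pattern the problem asks for: the first search direction is steepest descent, a Fletcher-Reeves coefficient beta = ||g_new||^2 / ||g_old||^2 is computed (and printed) each iteration, and iteration stops once norm(c), the gradient norm, falls below the tolerance. The locked solution's line search is not visible, so the reconstruction below is a hedged Python/NumPy sketch: the backtracking (Armijo) search and the descent-direction restart are my own assumptions, not the author's code.

```python
import numpy as np

def rosenbrock(x):
    # Rosenbrock's banana function from the test suite
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    # Analytic gradient, matching gradF in the tests
    return np.array([
        100.0 * (4*x[0]**3 - 4*x[0]*x[1]) + 2*x[0] - 2,
        100.0 * (2*x[1] - 2*x[0]**2),
    ])

def conj_grad_fr(F, gradF, x0, tol=1e-2, max_iter=20):
    """Fletcher-Reeves nonlinear CG sketch. The Armijo backtracking
    line search and the restart safeguard are assumptions; the locked
    Cody solution's actual line search is not shown."""
    x = np.asarray(x0, dtype=float)
    g = gradF(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:        # stop once norm(c) is below tol
            break
        alpha, c1 = 1.0, 1e-4              # Armijo backtracking along d
        while F(x + alpha * d) > F(x) + c1 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * d
        g_new = gradF(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if g_new @ d >= 0:                 # safeguard: restart if not descent
            d = -g_new
        g = g_new
    return x, F(x)

xmin, fmin = conj_grad_fr(rosenbrock, rosenbrock_grad, [1.1, 0.9],
                          tol=1e-3, max_iter=200)
print(xmin, fmin)
```

The restart safeguard is standard practice for nonlinear CG: with a simple Armijo search the FR direction is not guaranteed to be a descent direction (that guarantee needs a strong Wolfe line search), and the occasional very large beta values in the log above (e.g. 318.9298 and 91.1315) show how the raw FR update can momentarily blow up before the iteration recovers.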
