This question has been closed. Reopen it to edit or answer it.

Hi everybody, I am looking for a way to compare the convergence time of two algorithms (two optimization problems, actually). Is there any toolbox or sample code?

Calculate the convergence time of two algorithms and find out which one is better!

Answers (1)

Walter Roberson on 5 Nov 2020
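% Benchmark: fit two candidate models to noisy Bessel-function data with two
% solvers (fminunc and ga), time each run with tic/toc, and record the
% function-evaluation counts the solvers report.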
Niters = 5;
x = 1 : 50;
y = besselj(3, x) + randn(size(x))/20;
nvars = 2;
model1 = @(AB, x) AB(1).*x.^AB(2); %a*x^b
model2 = @(AB, x) x.^3 + AB(1).*x.^2 + AB(2).*log(x); %x^3 + a*x^2 + b*log(x);
residue1 = @(AB) sum((model1(AB, x) - y).^2);
residue2 = @(AB) sum((model2(AB, x) - y).^2);
models = {residue1, residue2};
modnames = {'a*x^b', 'x^3 + a*x^2 + b*log(x)'};
Nmodels = length(models);
x0 = randn(1,nvars) * 10;
fminunc_options = optimoptions(@fminunc, 'MaxIterations', 1e5);
ga_options = optimoptions(@ga, 'MaxGenerations', 1e5);
A = []; b = []; Aeq = []; beq = []; lb = []; ub = []; nlcon = [];
alg1 = @(FUN) fminunc( FUN, x0, fminunc_options );
alg2 = @(FUN) ga(FUN, nvars, A, b, Aeq, beq, lb, ub, nlcon, ga_options);
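% Note: ga does not take a start point here; x0 is used only by fminunc, while
% ga generates its own random initial population of nvars-dimensional individuals.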
algs = {alg1, alg2};
algnames = {'fminunc', 'ga'};
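% Both solvers report the number of objective-function evaluations in their
% output structure, but with different capitalization of the field name:
% fminunc uses output.funcCount, ga uses output.funccount.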
itercount_fminunc = @(info) info.funcCount;
itercount_ga = @(info) info.funccount;
itercount_funs = {itercount_fminunc, itercount_ga};
Nalgs = length(algs);
timings = zeros(Niters, Nalgs, Nmodels);
fvals = zeros(Niters, Nalgs, Nmodels);
infos = cell(Niters, Nalgs, Nmodels);
bestAB = cell(Niters, Nalgs, Nmodels);
itercounts = zeros(Niters, Nalgs, Nmodels);
for modidx = 1 : Nmodels
    thismodel = models{modidx};
    for algidx = 1 : Nalgs
        thisalg = algs{algidx};
        itercount_fun = itercount_funs{algidx};
        for iter = 1 : Niters
            tic;
            [bestAB{iter, algidx, modidx}, fvals(iter, algidx, modidx), ~, infos{iter, algidx, modidx}] = thisalg(thismodel);
            timings(iter, algidx, modidx) = toc;
            itercounts(iter, algidx, modidx) = itercount_fun(infos{iter, algidx, modidx});
        end
    end
end
Local minimum found. Optimization completed because the size of the gradient is less than the value of the optimality tolerance. (printed for each of the 10 fminunc runs)
Optimization terminated: average change in the fitness value less than options.FunctionTolerance. (printed for each of the 10 ga runs)
meantimes = permute(mean(timings), [2 3 1]);
meaniters = permute(mean(itercounts), [2 3 1]);
t = table(meaniters(:), meantimes(:), repelem(modnames(:), Nalgs, 1), repmat(algnames(:), Nmodels, 1));
t.Properties.VariableNames = {'iterations', 'time [s]', 'model', 'algorithm'}; % 'iterations' holds the function-evaluation counts
disp(t)
    iterations     time [s]               model                 algorithm 
    __________    _________    __________________________    ___________
        42        0.093775     {'a*x^b'                 }    {'fminunc'}
        42        0.0073104    {'x^3 + a*x^2 + b*log(x)'}    {'fminunc'}
     39900        0.84803      {'a*x^b'                 }    {'ga'     }
     40915        0.56368      {'x^3 + a*x^2 + b*log(x)'}    {'ga'     }
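If the goal is to compare how quickly each solver approaches its final objective value, and not only the total runtime, the solvers' built-in plot functions give a quick visual check. A minimal sketch, assuming the variables from the script above (residue1, x0, nvars) are still in the workspace; optimplotfval and gaplotbestf are the standard plot functions for fminunc and ga:
% Objective value per iteration (fminunc) and best score per generation (ga)
plot_fminunc = optimoptions(@fminunc, 'PlotFcn', 'optimplotfval');
plot_ga      = optimoptions(@ga, 'PlotFcn', @gaplotbestf);
figure; fminunc(residue1, x0, plot_fminunc);
figure; ga(residue1, nvars, [], [], [], [], [], [], [], plot_ga);
For timings that are less sensitive to run-to-run noise than a single tic/toc, timeit can be wrapped around each solver call; this sketch again assumes the workspace above and switches off the solvers' display output:
% timeit runs the handle several times and returns a robust estimate in seconds
quiet_fminunc = optimoptions(fminunc_options, 'Display', 'off');
quiet_ga      = optimoptions(ga_options, 'Display', 'off');
t_fminunc = timeit(@() fminunc(residue1, x0, quiet_fminunc));
t_ga      = timeit(@() ga(residue1, nvars, [], [], [], [], [], [], [], quiet_ga));
fprintf('fminunc: %.4f s   ga: %.4f s\n', t_fminunc, t_ga);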
