Calling GA() with 10 generations 10x vs GA() with 100 generations.

I'm running a GA optimization on a problem, and I want a way to save intermediate results so I can recover from potential crashes and exit early if desired.
I've been running GA like this:
options = optimoptions('ga','PlotFcn', @gaplotbestf,'Display','iter','PopulationSize',150,'MaxGenerations',10);
[x,fval,exitflag,output,final_pop,final_scores] = ga(fun,nvars,[],[],[],[],lb,ub,[],[],options);
for idx = 1:9
    file_name = 'dummy'; % file naming code excluded for brevity
    save(file_name,'final_pop','final_scores');
    % Resume from the previous call's final population and scores
    options = optimoptions('ga','PlotFcn',@gaplotbestf,'Display','iter','PopulationSize',150,'MaxGenerations',10,'InitialPopulationMatrix',final_pop,'InitialScoreMatrix',final_scores);
    [x,fval,exitflag,output,final_pop,final_scores] = ga(fun,nvars,[],[],[],[],lb,ub,[],[],options);
end
Clarification edit: on each call after the first, the options give ga() the final population and scores from the previous ga() run via the optimoptions() name-value pairs 'InitialPopulationMatrix',final_pop and 'InitialScoreMatrix',final_scores.
Is there any difference in the results ga will give (besides differences due to random seeding) vs. running a single call with 100 generations?
options = optimoptions('ga','PlotFcn', @gaplotbestf,'Display','iter','PopulationSize',150,'MaxGenerations',100,'UseParallel',true,'SelectionFcn',{@selectiontournament,2},'EliteCount',13);
[x,fval,exitflag,output,final_pop,final_scores] = ga(fun,nvars,[],[],[],[],lb,ub,[],[],options);
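For the crash-protection part of this, ga also accepts an output function through the 'OutputFcn' option, which can save the population every generation inside a single long run. A minimal sketch, where the function name gaCheckpoint and the .mat file naming are placeholders, not part of the code above:
% In gaCheckpoint.m -- standard ga output-function signature
function [state,options,optchanged] = gaCheckpoint(options,state,flag)
    optchanged = false;                 % no ga options are modified here
    if strcmp(flag,'iter')              % called once per generation
        pop    = state.Population;      % current population (PopulationSize-by-nvars)
        scores = state.Score;           % current fitness scores
        save(sprintf('ga_checkpoint_gen%03d.mat',state.Generation),'pop','scores');
    end
end
% Hook it into the single 100-generation run:
options = optimoptions('ga','PlotFcn',@gaplotbestf,'Display','iter','PopulationSize',150,'MaxGenerations',100,'OutputFcn',@gaCheckpoint);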

Answers (1)

John D'Errico 2024-4-17
Edited: John D'Errico 2024-4-17
Is there any difference? Of course! Make it more extreme yet. Suppose you were to stop after 1 iteration. ONLY 1. But do it 100 times. Each call will not come even remotely close to convergence. So you will get random crapola 100 times. Surely that is different when compared to allowing the optimizer to run to the point where it has converged, but doing it only once?
Even if you were to take the average of 100 sets of randomly unconverged crap, it would still probably not be very good.
2 Comments
Simon Perales 2024-4-17
I should have clarified: each of the 9 calls after the original continues right where the previous one left off by taking the inputs 'InitialPopulationMatrix',final_pop and 'InitialScoreMatrix',final_scores.
It does converge, but I was wondering if there is anything in the underlying ga algorithm that makes continuing in a new ga() call any different from doing the full 100 generations in one call.
John D'Errico 2024-4-17
Ok. That is different.
GA does not generate anything in the form of a Hessian matrix, which is how a memory would otherwise be carried between calls. As such, I think the two cases would now be similar, as long as you can start out with an initial population that is identical to where GA ended up last. I don't know that they would be identical, of course. The random seed state would influence things, so if that seed managed to get touched between calls, all bets are off.
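A minimal sketch of what keeping that state intact across the chained calls could look like, assuming serial execution (with 'UseParallel' the workers draw from their own random streams) and with opts standing in for the shared options object:
rng(0);                                  % fix the global generator before the first call
[x,fval,exitflag,output,final_pop,final_scores] = ga(fun,nvars,[],[],[],[],lb,ub,[],[],opts);
rngState = rng;                          % capture the RNG state ga leaves behind
save('ga_checkpoint.mat','final_pop','final_scores','rngState');
% ... later, possibly after a crash or restart ...
load('ga_checkpoint.mat','final_pop','final_scores','rngState');
rng(rngState);                           % restore the RNG state before resuming
opts = optimoptions(opts,'InitialPopulationMatrix',final_pop,'InitialScoreMatrix',final_scores);
[x,fval,exitflag,output,final_pop,final_scores] = ga(fun,nvars,[],[],[],[],lb,ub,[],[],opts);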
