Slow bayesopt initialization in parallel computing

Hi,
I am using a Gaussian process regression model to fit my data through the fitrgp function. I defined my own, somewhat complex kernel, tested it on a small subset of my data, and got satisfying results. Now I am applying it to my whole data set with 'UseParallel' on. My questions are the following:
1. Say I have 64 cores and set 1 worker per core, with a setup like this (the complete call is sketched below, after my second question):
'OptimizeHyperparameters', 'auto', ...
'HyperparameterOptimizationOptions', ...
    struct('UseParallel', true, ...
           'MaxObjectiveEvaluations', 64, ...
           'Repartition', true, ...
           'kfold', 20));
I understand the optimizer is Bayesian optimization, but what does the parallelism look like? Is each objective function evaluation performed on all 64 workers simultaneously, with the 64 evaluations running one after another? Or instead, is each evaluation computed on a single worker, with the 64 evaluations running in parallel on their own cores?
2. I noticed, after getting the message:
“Copying objective function to workers...
Done copying objective function to workers.”
it takes quite a while (hours) before any further evaluation information is printed. I guess this is because of the initialization of bayesopt. Is there anything I can do to speed it up?
I also noticed that after the first evaluation is printed, the remaining 63 results come relatively quickly. Why is that?
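For context, the complete call looks roughly like this (X, Y, myKernel and theta0 are placeholders for my actual data, my self-defined kernel function, and its initial parameter vector):

parpool(64);                                   % 64 workers, one per core

mdl = fitrgp(X, Y, ...
    'KernelFunction', @myKernel, ...           % self-defined kernel
    'KernelParameters', theta0, ...            % initial kernel parameters
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', ...
    struct('UseParallel', true, ...
           'MaxObjectiveEvaluations', 64, ...
           'Repartition', true, ...
           'kfold', 20));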
Thank you very much for your help.
Mono

Accepted Answer

Don Mathis 2018-11-30
Edited: Don Mathis 2018-11-30
1. Your second idea was right: "each evaluation is computed on one worker, and 64 evaluations on their specific cores are computed in parallel". So if you have 64 workers and set MaxObjectiveEvaluations=64, you will get 64 function evaluations performed at the same time, each using 1 worker and 1 core, and then the optimization will stop. Those 64 function evaluations will be run on random hyperparameter values, so bayesopt will never fit models or do any intelligent choosing of points.
2. After it says “Copying objective function to workers...Done copying objective function to workers”, your 64 function evaluations begin running in parallel. If it takes hours before you see anything else, then apparently, running a single one of your function evaluations takes hours. The evaluations are run asynchronously, so the next thing you will see are the results of the first one that finishes. Then, as each job finishes, you will see more and more reports, each with gradually increasing runtimes. The reason the subsequent reports appear faster than the first one appeared is that those jobs have been running the whole time, and so are probably nearly done at the time the first job finishes.
To speed things up, I would suggest reducing 'kfold' from 20 to 5.
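For example (the exact numbers are just illustrative), options along these lines give bayesopt room to fit its surrogate model and choose points after the first parallel batch, and make each evaluation train 5 cross-validation models instead of 20:

'HyperparameterOptimizationOptions', ...
    struct('UseParallel', true, ...
           'MaxObjectiveEvaluations', 200, ... % more than the number of workers, so later points are model-guided
           'Repartition', true, ...
           'kfold', 5)                         % 5 folds instead of 20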

More Answers (1)

Alan Weiss 2018-11-27
Perhaps the documentation in Parallel Bayesian Optimization can help. In particular, there are descriptions of several methods of getting the objective function onto the workers.
Also, and I do not know if this is what you are seeing, most MATLAB operations are faster the second time they are executed because MATLAB, an interpreted language, compiles on the fly and uses the compiled code in later calls.
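For example (a rough sketch, and only applicable once you call bayesopt with your own objective function rather than through fitrgp's 'OptimizeHyperparameters' option), one such pattern is to put the large data on the workers once with a parallel.pool.Constant, so each evaluation reads a worker-local copy instead of having the data shipped again; myObjectiveFcn here is a placeholder for your own objective:

% X and Y stand for your predictor matrix and response vector.
dataC = parallel.pool.Constant(struct('X', X, 'Y', Y));  % transferred to each worker once

% The objective reads the data from the Constant on whichever worker runs it.
obj = @(params) myObjectiveFcn(params, dataC.Value.X, dataC.Value.Y);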
Alan Weiss
MATLAB mathematical toolbox documentation
3 Comments
mono 2018-11-28
An update on the information: it is indeed the copying process that takes the time. I notice that if I pause the code, it stops inside the fevalFuture.m file and stays there for hours.
Next, I will investigate how to copy the objective function to the workers more efficiently. But it still seems the fitrgp documentation page doesn't say much about this, since bayesopt is wrapped inside it.
Alan Weiss 2018-11-28
The Bayesian optimization features in fitrgp are not as flexible or extensible as those in bayesopt itself. You will probably have to use bayesopt directly to do what you want most efficiently. And I hope that you noticed the descriptions of the various ways of getting the objective function onto the workers. Some can save considerable time, especially for repeated optimizations, where you can leave the objective function on the workers and just change some data.
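A rough sketch of what that could look like (the variable names, search ranges, myKernel, X and Y are all placeholders, and you would adapt the objective to whatever your custom kernel actually needs):

% Sketch only: X, Y, myKernel, and the hyperparameter names/ranges are placeholders.
pool = gcp('nocreate');
if isempty(pool), parpool(64); end

Xc = parallel.pool.Constant(X);          % data goes to each worker once
Yc = parallel.pool.Constant(Y);

vars = [optimizableVariable('lengthScale', [1e-2, 1e2], 'Transform', 'log'), ...
        optimizableVariable('signalStd',   [1e-3, 1e1], 'Transform', 'log')];

obj = @(t) gprCvLoss(t, Xc, Yc);

results = bayesopt(obj, vars, ...
    'UseParallel', true, ...
    'MaxObjectiveEvaluations', 200);     % more than the number of workers

function loss = gprCvLoss(t, Xc, Yc)
% 5-fold cross-validated loss of a GPR model with the custom kernel,
% evaluated at one hyperparameter point t (a one-row table from bayesopt).
mdl  = fitrgp(Xc.Value, Yc.Value, ...
    'KernelFunction', @myKernel, ...                    % your self-defined kernel
    'KernelParameters', [t.lengthScale; t.signalStd]);  % initial values; fitrgp refines them unless 'FitMethod' is 'none'
cv   = crossval(mdl, 'KFold', 5);
loss = kfoldLoss(cv);
end

Because the Constants stay on the workers between calls, a later optimization on the same pool can reuse them and skip the data transfer entirely.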
Alan Weiss
MATLAB mathematical toolbox documentation
