Can I use parallel computing to train a DNN?

2 views (last 30 days)
Below is the code for a neural network I have designed. Training runs fine on a single CPU but is very slow, so is it possible to use a parallel setup? I have made what I think are the required modifications, but I get an error (also below). What is meant by a recurrent network, and is there any way to overcome this error?
Error using trainNetwork (line 150)
Parallel training of recurrent networks is not supported. 'ExecutionEnvironment' value in trainingOptions function must be 'auto', 'gpu', or 'cpu'.
Error in Network2 (line 49)
network2 = trainNetwork(TrainingInputData,TrainingOutputData,layers,options)
Caused by:
Error using nnet.internal.cnn.assembler.setupExecutionEnvironment (line 17)
Parallel training of recurrent networks is not supported. 'ExecutionEnvironment' value in trainingOptions function must be 'auto', 'gpu', or 'cpu'.
clc
clear
parpool

% Prepare the data and hold out the 4th sequence for validation
[outputdata, inputdata, specifications] = dataprep();
TrainingOutputData = outputdata;
TrainingInputData = inputdata;
TestingOutputData = cell(1,1);
TestingInputData = cell(1,1);
TestingOutputData{1,1} = outputdata{4,1};
TestingInputData{1,1} = inputdata{4,1};
TrainingOutputData(4,:) = [];
TrainingInputData(4,:) = [];

% Training hyperparameters
maxEpochs = 10;
miniBatchSize = 1;
Neurons = 150000;
numResponses = size(TrainingOutputData{1},1);
featureDimension = size(TrainingInputData{1},1);

% Sequence-to-sequence regression network
layers = [ ...
    sequenceInputLayer(featureDimension)
    fullyConnectedLayer(Neurons)
    fullyConnectedLayer(numResponses)
    regressionLayer];

options = trainingOptions('sgdm', ...
    'MaxEpochs',maxEpochs, ...
    'MiniBatchSize',miniBatchSize, ...
    'InitialLearnRate',0.01, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',3, ...
    'LearnRateDropFactor',0.1, ...
    'GradientThreshold',1, ...
    'Shuffle','never', ...
    'Plots','training-progress', ...
    'ValidationData',{TestingInputData,TestingOutputData}, ...
    'ValidationFrequency',1, ...
    'Verbose',1, ...
    'VerboseFrequency',1, ...
    'ExecutionEnvironment','parallel');
network2 = trainNetwork(TrainingInputData,TrainingOutputData,layers,options)

Accepted Answer

Joss Knight, 2019-2-15
No, you can't use parallel training for a sequence network, sorry.
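Note that the network above counts as a sequence network because it begins with sequenceInputLayer, which is why trainNetwork rejects 'ExecutionEnvironment','parallel'. The error message itself lists the supported alternatives, so one workaround (a sketch, assuming a CUDA-capable GPU and Parallel Computing Toolbox are available) is to drop the parpool call and train on a single GPU instead:

```matlab
% Single-GPU training does not need a parallel pool, so omit parpool.
% Only the 'ExecutionEnvironment' value changes from the original code.
options = trainingOptions('sgdm', ...
    'MaxEpochs',maxEpochs, ...
    'MiniBatchSize',miniBatchSize, ...
    'InitialLearnRate',0.01, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',3, ...
    'LearnRateDropFactor',0.1, ...
    'GradientThreshold',1, ...
    'Shuffle','never', ...
    'Plots','training-progress', ...
    'ValidationData',{TestingInputData,TestingOutputData}, ...
    'ValidationFrequency',1, ...
    'Verbose',1, ...
    'VerboseFrequency',1, ...
    'ExecutionEnvironment','gpu');   % or 'auto' to fall back to CPU
network2 = trainNetwork(TrainingInputData,TrainingOutputData,layers,options);
```

With miniBatchSize = 1, GPU utilization will be low; if the sequences allow it, a larger mini-batch size is usually the bigger speedup.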

More Answers (0)
