Out of memory when training a semantic segmentation network

1 view (last 30 days)
I get an error when trying to train my dataset for semantic segmentation using deep learning. This is my code:
doTraining = true;
if doTraining
    % Train the network and save the result
    [net,info] = trainNetwork(pximds,lgraph,options);
    save('D:\TAEvianita\video\tilesdatase\Pretraineddata.mat','net');
    disp('NN trained')
else
    % Load the pre-trained network from the checkpoint
    data = load('D:\TAEvianita\video\tilesdatase\net_checkpoint__266__2019_12_23__23_44_28.mat');
    net = data.net;
end
And these are my training options:
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment','gpu', ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',10, ...
    'LearnRateDropFactor',0.3, ...
    'Momentum',0.9, ...
    'InitialLearnRate',1e-3, ...
    'L2Regularization',0.005, ...
    'MaxEpochs',30, ...
    'MiniBatchSize',2, ...
    'Shuffle','every-epoch', ...
    'CheckpointPath','D:\TAEvianita\video\tilesdatase', ...
    'VerboseFrequency',2, ...
    'Plots','training-progress', ...
    'ValidationPatience',4);
And this is the error I got:
Error using trainNetwork (line 170)
Out of memory. Type "help memory" for your options.
Caused by: Out of memory. Type "help memory" for your options.
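A common first step for GPU out-of-memory errors during training (a hedged suggestion, not part of the original post) is to shrink the memory footprint per iteration: reduce `MiniBatchSize` further, or fall back to `'cpu'` execution to rule out GPU memory limits. A minimal sketch of the adjusted options, keeping the other settings from the post:

```matlab
% Sketch: reduce per-iteration memory use. MiniBatchSize 1 is the
% smallest possible batch; 'ExecutionEnvironment','cpu' trades speed
% for the (usually larger) system RAM instead of GPU memory.
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment','cpu', ...   % or keep 'gpu' and only lower the batch size
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',10, ...
    'LearnRateDropFactor',0.3, ...
    'Momentum',0.9, ...
    'InitialLearnRate',1e-3, ...
    'L2Regularization',0.005, ...
    'MaxEpochs',30, ...
    'MiniBatchSize',1, ...              % reduced from 2
    'Shuffle','every-epoch', ...
    'CheckpointPath','D:\TAEvianita\video\tilesdatase', ...
    'VerboseFrequency',2, ...
    'Plots','training-progress', ...
    'ValidationPatience',4);
```

If even a batch size of 1 runs out of GPU memory, the image tiles themselves may be too large for the network and GPU in use; cropping or downsampling the tiles in the datastore is the usual next step.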

Answers (0)

