Error in trainMaskRCNN function

13 views (last 30 days)
Claudia De Clemente
Answered: James on 23 Jun 2023
Hello everyone,
I am trying to train a Mask R-CNN network with 512×512×3 images, stored in a datastore as required, with each observation a 1×4 cell array containing:
1) the original image
2) the bounding boxes
3) the labels
4) the masks
Since this is my first attempt, I copied the training options from the example I found on the MATLAB website.
options = trainingOptions("sgdm", ...
    InitialLearnRate=0.001, ...
    LearnRateSchedule="piecewise", ...
    LearnRateDropPeriod=1, ...
    LearnRateDropFactor=0.95, ...
    Plots="none", ...
    Momentum=0.9, ...
    MaxEpochs=10, ...
    MiniBatchSize=1, ...
    BatchNormalizationStatistics="moving", ...
    ResetInputNormalization=false, ...
    ExecutionEnvironment="gpu", ...
    VerboseFrequency=50);
However, I was forced to set MiniBatchSize to 1 because the GPU goes out of memory otherwise (I don't really understand why). In any case, I then came across this error:
Error using .*
Arrays have incompatible sizes for this operation.
Error in deep.internal.recording.operations.TimesBroadcastOp/forward (line 31)
x = x .* y;
Error in .* (line 39)
zdata = matlab.lang.internal.move(xdata) .* matlab.lang.internal.move(ydata);
Error in vision.internal.cnn.maskrcnn.CrossEntropy (line 20)
loss = sum( T .* log(nnet.internal.cnn.util.boundAwayFromZero(Y)), 3);
Error in vision.internal.cnn.maskrcnn.MaskRCNNLoss/lossFcn (line 74)
LossRCNNClass = vision.internal.cnn.maskrcnn.CrossEntropy(YRCNNClass, classificationTargets);
Error in images.dltrain.internal.SerialTrainer>modelGradients (line 136)
[loss,lossData] = lossFcn.lossFcn(networkOutputs{:},targets{:});
Error in deep.internal.dlfeval (line 17)
[varargout{1:nargout}] = fun(x{:});
Error in dlfeval (line 40)
[varargout{1:nargout}] = deep.internal.dlfeval(fun,varargin{:});
Error in images.dltrain.internal.SerialTrainer/fit (line 76)
[loss,grad,state,networkOutputs,lossData] = dlfeval(@modelGradients,self.Network,self.LossFcn,...
Error in images.dltrain.internal.dltrain (line 102)
net = fit(networkTrainer);
Error in trainMaskRCNN (line 257)
[network,info] = images.dltrain.internal.dltrain(mbqTrain,network,options,lossFcn,metrics,'Loss', 'ExperimentMonitor',params.ExperimentMonitor);
Do you have any idea what it means? I haven't found anything similar online; I am lost, to say the least. Thank you to whoever responds.
Have a nice day.
3 Comments
Claudia De Clemente
Hello, thank you for your response! In the end I found out it was a problem with the number of classes: by mistake, I had declared more classes than I actually had. Unfortunately, even though the code now seems correct, I get the following error:
Layer 'res5c_branch2c': Invalid input data. Out of memory on device.
I have an RTX A5000, brand new. I don't understand. There is very little help on Mask R-CNN in MATLAB...
Joss Knight on 7 May 2023
It looks like you are still running out of memory; it just isn't being reported very well.
You might need to provide reproduction steps, because without seeing the code, specifically how you have modified the example and the input data, we can only speculate about what is going on.
Typically you would be making a mistake like storing a lot of data on the GPU before you start training, or your network would be (perhaps accidentally) generating very large intermediate outputs or very large weights.
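As a first check along those lines (a sketch, not from the original thread), the Parallel Computing Toolbox `gpuDevice` function reports how much memory is free on the card before training starts, and `reset` reclaims memory from arrays already parked on the GPU:

```
% Query the active GPU and report its memory (a quick sanity check)
d = gpuDevice;                       % returns the currently selected GPU
fprintf("Total memory:     %.1f GB\n", d.TotalMemory/1e9);
fprintf("Available memory: %.1f GB\n", d.AvailableMemory/1e9);

% If AvailableMemory is much lower than TotalMemory before training,
% reset the device to clear any leftover gpuArray data
reset(d);
```

If available memory is already low before calling trainMaskRCNN, the problem is data left on the device rather than the network itself.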


Answers (1)

James on 23 Jun 2023
For those who might find this useful: in my case the issue arose from there being stray categories left over in my datastore from previous processing, even though they had zero entries.
I got a list of the categories and their entry counts with this:
% Pull the label column out of the underlying box-label datastore
labsx = combinedDS_train.UnderlyingDatastores{1,2}.LabelData(:,2);
% Concatenate every per-image label array into one categorical vector
a = vertcat(labsx{:});
% List each category and its number of entries (zero-count ones included)
summary(a)
Then I removed the extra categories by converting each categorical array to a cell array of character vectors and back to categorical again:
% Work on a copy of the label data
boxcell = combinedDS_train.UnderlyingDatastores{1,2}.LabelData;
for i = 1:size(boxcell,1)      % loop over rows, not length()
    labs = boxcell{i,2};
    labs = cellstr(labs);      % round-trip through cellstr drops
    labs = categorical(labs);  % categories with zero entries
    boxcell{i,2} = labs;
end
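A more direct route (a sketch, assuming the same datastore layout as above) is MATLAB's `removecats` function, which drops categories that have no elements without the cellstr round trip:

```
% Drop unused (zero-entry) categories from each label array in place
boxcell = combinedDS_train.UnderlyingDatastores{1,2}.LabelData;
for i = 1:size(boxcell,1)
    boxcell{i,2} = removecats(boxcell{i,2});  % removes empty categories only
end
```

Either way, the categorical arrays must end up with exactly as many categories as the class count passed to the network, or the classification loss will see mismatched array sizes.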

Release
R2022b
