Customized RegressionLayer Loss Function "Mean Root Loss" gives NaN

I am using a U-Net with input images of size 240x240x35 and four different output images of size 240x240, so in total the output layer is 240x240x4.
Now I want to change the loss function, and I have implemented several ones that work quite nicely:
MSE, MAE (mean absolute error), and RL (relative error) all work. Now I want a mean root error (MRE), which does the same as MAE but takes a sqrt around "abs(Y-T)".
However, when I do this, I get NaN after a few iterations. I have read the post at (https://de.mathworks.com/matlabcentral/answers/337587-how-to-avoid-nan-in-the-mini-batch-loss-from-traning-convolutional-neural-network), but reducing the initial learning rate did not change anything. Additionally, I have a lot of convolutional layers, and my reasoning is this: if MAE works, then the sqrt of it should also work, since it only maps one positive real number to another positive real number...
Can someone help me understand why I get NaN only when using the sqrt of MAE?
Best
Ingo
classdef RegressionLayer_Function < nnet.layer.RegressionLayer
    % Custom regression layer with a selectable loss: MAE, MSE, MRE or RL.

    methods
        function layer = RegressionLayer_Function(name)
            % layer = RegressionLayer_Function(name) creates a regression
            % layer and selects the loss function via the name argument
            % ('mse', 'rl', 'mre', otherwise MAE).

            % Set layer name.
            layer.Name = "myRegressionLayer";

            % Set layer description (used to select the loss in forwardLoss).
            if strcmp(name,'mse')
                layer.Description = 'MSE';
            elseif strcmp(name,'rl')
                layer.Description = 'RL';
            elseif strcmp(name,'mre')
                layer.Description = 'MRE';
            else
                layer.Description = 'MAE';
            end
        end

        function loss = forwardLoss(layer, Y, T)
            % loss = forwardLoss(layer, Y, T) returns the selected loss
            % between the predictions Y and the training targets T.
            if strcmp(layer.Description,'MAE')
                % Calculate MAE.
                R = size(Y,3);
                meanAbsoluteError = sum(abs(Y-T),3)./R;
                % Take mean over mini-batch.
                N = size(Y,4);
                loss = sum(meanAbsoluteError(:))./N;
            elseif strcmp(layer.Description,'MSE')
                % Calculate MSE.
                R = size(Y,3);
                meanSquaredError = sum(abs(Y-T).^2,3)./R;
                % Take mean over mini-batch.
                N = size(Y,4);
                loss = sum(meanSquaredError(:))./N;
            elseif strcmp(layer.Description,'MRE')
                % Calculate MRE: sqrt of the absolute error.
                R = size(Y,3);
                C = sqrt(abs(Y-T));
                meanRootError = sum(C,3)./R;
                % Take mean over mini-batch.
                N = size(Y,4);
                loss = sum(meanRootError(:))./N;
            elseif strcmp(layer.Description,'RL')
                % Calculate relative error, ignoring pixels where T == 0.
                R = size(Y,3);
                C = abs(Y-T);
                C(T==0) = 0;
                T(T==0) = 1;
                relativeError = sum(C./T,3)./R;
                % Take mean over mini-batch.
                N = size(Y,4);
                loss = sum(relativeError(:))./N;
            end
        end
    end
end
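
For context, here is a small standalone check of the MRE forward loss on random data (sizes taken from the post, variable names assumed). The forward value itself stays finite even when Y equals T; the sketch only illustrates where the sqrt can become problematic, since the derivative of sqrt(x), 1/(2*sqrt(x)), is unbounded as x approaches 0, which can drive gradients to Inf/NaN during backpropagation.

% Sanity check of the MRE forward loss (sizes from the post, mini-batch of 2 assumed).
Y = rand(240,240,4,2,'single');   % predictions
T = rand(240,240,4,2,'single');   % targets

R = size(Y,3);
N = size(Y,4);

mre = sum(sum(sqrt(abs(Y-T)),3)./R, 'all')./N          % finite
mreAtZeroError = sum(sum(sqrt(abs(Y-Y)),3)./R, 'all')./N   % 0, still finite

% The derivative of sqrt(x) is 1/(2*sqrt(x)), which is unbounded near x = 0.
% One possible workaround (an assumption, not part of the original post) is to
% use sqrt(abs(Y-T) + eps('single')) inside forwardLoss.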

Answers (1)

Raynier Suresh 2021-2-26
Hi, I tried the MRE with a few inputs but I could not replicate this issue. Check whether any of your inputs contain NaNs, and also check that the network is not generating NaN values.
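
As an illustration of that check (variable names assumed, not from the original post), you can scan one input volume and the corresponding network prediction for NaN/Inf:

% Assumed names: X is one input volume (240x240x35), net is the trained network.
hasBadInput  = any(~isfinite(X(:)))       % true if the input contains NaN/Inf

YPred = predict(net, X);                  % 240x240x4 prediction
hasBadOutput = any(~isfinite(YPred(:)))   % true if the network produces NaN/Inf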
