mean squared logarithmic error regression layer
I'm trying to write an MSLE regression layer.
Here is my code:
"
classdef msleRegressionLayer < nnet.layer.RegressionLayer
    % Custom regression layer with mean-squared-logarithmic-error loss.

    methods
        function layer = msleRegressionLayer(name)
            % layer = msleRegressionLayer(name) creates a
            % mean-squared-logarithmic-error regression layer and
            % specifies the layer name.

            % Set layer name.
            layer.Name = name;

            % Set layer description.
            layer.Description = 'Mean squared logarithmic error';
        end

        function loss = forwardLoss(layer, Y, T)
            % loss = forwardLoss(layer, Y, T) returns the MSLE loss between
            % the predictions Y and the training targets T.

            % Calculate MSLE over the responses.
            % Note the elementwise division ./ (the original / is matrix division).
            R = size(Y,1);
            msle = sum((log10((Y+1)./(T+1))).^2, 1)/R;

            % Take mean over mini-batch.
            N = size(Y,2);
            loss = sum(msle,2)/N;
        end

        function dLdY = backwardLoss(layer, Y, T)
            % Returns the derivatives of the MSLE loss with respect to the
            % predictions Y. Since d/dY log10(Y+1) = 1./((Y+1)*log(10)),
            % we divide by log(10) rather than multiply by it; MATLAB's
            % natural logarithm is log, not ln.
            R = size(Y,1);
            N = size(Y,2);
            dLdY = 2*(log10(Y+1) - log10(T+1)) ./ ((Y+1)*log(10)) / (N*R);
        end
    end
end
"
In this case, the size of x_train is 1024 x 500000 and the size of Y_train is 1 x 500000.
Any help is welcome.
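One quick way to check that backwardLoss matches forwardLoss is a central finite-difference test on a single entry. A minimal sketch, assuming the msleRegressionLayer class above is on the path (the variable names here are just for the sketch):

```matlab
% Finite-difference sanity check for the custom MSLE layer.
layer = msleRegressionLayer('msle');
Y = rand(3, 5);          % 3 responses, mini-batch of 5
T = rand(3, 5);
h = 1e-6;                % perturbation step

% Analytic gradient from backwardLoss.
analytic = layer.backwardLoss(Y, T);

% Central difference of forwardLoss with respect to Y(1,1).
Yp = Y; Yp(1,1) = Yp(1,1) + h;
Ym = Y; Ym(1,1) = Ym(1,1) - h;
numeric = (layer.forwardLoss(Yp, T) - layer.forwardLoss(Ym, T)) / (2*h);

% The two values should agree to several significant digits.
fprintf('analytic %g vs numeric %g\n', analytic(1,1), numeric);
```

The Deep Learning Toolbox also provides checkLayer, which runs validity and gradient checks on custom layers automatically, e.g. checkLayer(layer,[3 5],'ObservationDimension',2).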
Accepted Answer
VICTOR CATALA
2019-6-27
3 Comments
Erdem AKAGUNDUZ
2020-3-16
Hello Victor,
Nice job with MSLE Loss layer, and thanks.
I have a question actually, and I hope you can help me.
Why do we divide by the mini-batch size (N = size(Y,4)) in the backwardLoss function?
I know the examples in the MATLAB help also do this, but I don't understand it, so I am looking for an answer.
For example:
if the output of the network (that goes in the loss function) is 224x224x1xN
then we expect the size of dLdY to be the same as 224x224x1xN.
So why do we divide this gradient by N? We did NOT sum the gradients over the mini-batch dimension, so why average along it?
Thank you very much.
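One way to see where the 1/N comes from: forwardLoss already averages over the mini-batch, so the scalar loss (for the R-by-N case above) is

```latex
L = \frac{1}{NR} \sum_{n=1}^{N} \sum_{r=1}^{R}
    \left( \log_{10}(Y_{rn}+1) - \log_{10}(T_{rn}+1) \right)^2
```

and differentiating with respect to a single prediction gives

```latex
\frac{\partial L}{\partial Y_{rn}}
  = \frac{2}{NR} \cdot
    \frac{\log_{10}(Y_{rn}+1) - \log_{10}(T_{rn}+1)}{(Y_{rn}+1)\ln 10}
```

So dLdY has the same size as Y, but each element carries the factor 1/(NR) because each Y_{rn} appears exactly once inside the averaged sum. Nothing is summed over the batch dimension in the gradient; the 1/N is simply inherited from the averaging done in the forward pass.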
More Answers (0)