What is wrong with my implementation of the binary cross-entropy output layer?
I am trying to make a neural network with one output node, i.e. binary classification. To do so, I use a sigmoidLayer followed by a custom binary cross-entropy output layer. I tried using the built-in crossentropy function for the forward loss, which should compute the backpropagation automatically, but that did not work. I then tried writing the forward and backward functions manually, as follows:
classdef binaryClassificationLayer < nnet.layer.ClassificationLayer % ...
        % & nnet.layer.Acceleratable % (Optional)

    properties
        % (Optional) Layer properties.
        % Layer properties go here.
    end

    methods
        function layer = binaryClassificationLayer(name)
            % (Optional) Create a myClassificationLayer.

            % Set layer name.
            layer.Name = name;

            % Set layer description.
            layer.Description = 'Binary cross-entropy';
        end

        function loss = forwardLoss(layer,Y,T)
            % Return the binary cross-entropy loss between the predictions Y
            % and the training targets T.
            %
            % Inputs:
            %     layer - Output layer
            %     Y     - Predictions made by network
            %     T     - Training targets
            %
            % Output:
            %     loss  - Binary cross-entropy loss between Y and T

            % loss = crossentropy(Y,T,'TargetCategories','independent');
            % loss = crossentropy(net,T,Y);
            N = length(Y);
            loss = -1/N*(T'*log(Y) + (1-T)'*log(1-Y));
        end

        function dLdY = backwardLoss(layer,Y,T)
            % (Optional) Backward propagate the derivative of the loss
            % function.
            %
            % Inputs:
            %     layer - Output layer
            %     Y     - Predictions made by network
            %     T     - Training targets
            %
            % Output:
            %     dLdY  - Derivative of the loss with respect to the
            %             predictions Y

            dLdY = zeros(size(Y));
            for i = 1:length(Y)
                dLdY(i) = -T(i)/Y(i) + (1-T(i))/(1-Y(i));
            end
        end
    end
end
but I still get errors. Please help. What am I doing wrong?
Answer (1)
Saarthak Gupta
2023-9-7
Hi,
I understand that you are trying to implement a custom classification layer with cross-entropy loss.
The ‘crossentropy’ function from the Deep Learning Toolbox should suffice for your purpose. However, when the predictions (Y) are given as a numeric array or an unformatted ‘dlarray’, you need to specify the dimension order of the input data using the ‘DataFormat’ argument for ‘crossentropy’ to work correctly.
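As a quick standalone sketch of that call (the array sizes and the ‘SSCB’ layout here are assumptions about your data; ‘TargetCategories’ set to ‘independent’ is carried over from your first attempt, since a single sigmoid unit makes an independent binary prediction rather than an exclusive multi-class one):

Y = dlarray(rand(1,1,1,8));                % unformatted predictions in (0,1), laid out as SSCB
T = dlarray(double(rand(1,1,1,8) > 0.5));  % binary targets in the same layout
loss = crossentropy(Y,T, ...
    'TargetCategories','independent', ...  % binary (non-exclusive) cross-entropy
    'DataFormat','SSCB')                   % required because Y is unformatted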
Consider the following example which implements a custom binary classification layer with cross-entropy loss:
classdef binaryClassificationLayer < nnet.layer.ClassificationLayer % ...
        % & nnet.layer.Acceleratable

    methods
        function layer = binaryClassificationLayer(name)
            % Create a binary cross-entropy classification layer and set
            % the layer name.
            layer.Name = name;
        end

        function loss = forwardLoss(layer, Y, T)
            % Cross-entropy loss between the predictions Y and the
            % targets T; Y is unformatted, so the dimension order must
            % be supplied explicitly.
            loss = crossentropy(Y,T,'DataFormat','SSCB');
        end
    end
end
The ‘DataFormat’ argument of the ‘crossentropy’ function is specified as ‘SSCB’ (spatial, spatial, channel, batch), which is the dimension order typically used for 2-D image data. Depending on your input data, you may need to specify a different dimension order. The above implementation works without any errors.
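For example, if your network consumes feature vectors rather than images, the sigmoid layer's output is typically laid out as channels-by-observations, so the loss would be computed with ‘CB’ instead (a sketch; the ‘CB’ layout is an assumption about your data):

function loss = forwardLoss(layer, Y, T)
    % 'CB' = channel x batch, e.g. a 1-by-miniBatchSize row of predictions
    loss = crossentropy(Y,T,'DataFormat','CB');
end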
Please refer to the following MATLAB documentation for more details:
- https://in.mathworks.com/help/deeplearning/ref/dlarray.crossentropy.html
- https://in.mathworks.com/help/deeplearning/ug/define-custom-classification-output-layer.html
- https://in.mathworks.com/help/deeplearning/ref/dlarray.html
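Finally, you can sanity-check the custom layer before training with the ‘checkLayer’ function from the Deep Learning Toolbox. A minimal sketch, assuming one 1-by-1-by-1 sigmoid response per observation (the input size and observation dimension are assumptions about your network):

layer = binaryClassificationLayer('binary_cross_entropy');
% Observations run along dimension 4 (the 'B' in 'SSCB').
checkLayer(layer,[1 1 1],'ObservationDimension',4)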