dlgradient: Error: Value to differentiate must be a traced dlarray scalar.
I want to train a deep network using automatic differentiation. Is there any solution?
layer2 = [
    imageInputLayer([9 36 1],'Normalization','none','Name','input1-fcc')
    convolution2dLayer([7,7],64,'Name','conv1-fcc')
    batchNormalizationLayer('Name','bn1-fcc')
    reluLayer('Name','relu1-fcc')
    globalAveragePooling2dLayer('Name','pool5-fcc')
    fullyConnectedLayer(1,'Name','fc1')];
lgraph = layerGraph(layer2);
dlnet = dlnetwork(lgraph);
% Input
a = rand(9,36,1,10);
a = dlarray(a,'SSCB');
a_pre = forward(dlnet,a);
% output
b = rand(1,10);
loss = mse(a_pre,b);
gradients = dlgradient(loss,dlnet.Learnables);
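% The dlgradient call above throws: "Value to differentiate must be a traced dlarray scalar."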

Accepted Answer
Anshika Chaurasia
2021-1-18
Hi Qi Lu,
dlgradient must be called inside a function that is evaluated by dlfeval; dlfeval traces the dlarray operations so that the loss is a traced dlarray scalar that can be differentiated. You can try the following code to compute the gradients, which will resolve your error:
layer2 = [
    imageInputLayer([9 36 1],'Normalization','none','Name','input1-fcc')
    convolution2dLayer([7,7],64,'Name','conv1-fcc')
    batchNormalizationLayer('Name','bn1-fcc')
    reluLayer('Name','relu1-fcc')
    globalAveragePooling2dLayer('Name','pool5-fcc')
    fullyConnectedLayer(1,'Name','fc1')];
lgraph = layerGraph(layer2);
dlnet = dlnetwork(lgraph);
% Input
a = rand(9,36,1,10);
a = dlarray(a,'SSCB');
[loss,gradients] = dlfeval(@compute_gradient,dlnet,a);
function [loss,gradients] = compute_gradient(dlnet,a)
    a_pre = forward(dlnet,a);
    % Random target output
    b = rand(1,10);
    loss = mse(a_pre,b);
    gradients = dlgradient(loss,dlnet.Learnables); % automatic differentiation
end
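For completeness, here is a minimal sketch of a full training loop that uses the gradients returned by dlfeval to update the network with adamupdate. The epoch count, learning rate, random mini-batch, and the modelLoss helper are placeholders for illustration, not part of the original question:
% Minimal training-loop sketch (illustrative values; replace the random data
% with your real inputs and targets).
numEpochs = 10;      % placeholder epoch count
learnRate = 1e-3;    % placeholder learning rate
avgGrad = [];        % Adam running averages, initialized empty
avgSqGrad = [];
for epoch = 1:numEpochs
    % Placeholder mini-batch
    X = dlarray(rand(9,36,1,10),'SSCB');
    T = rand(1,10);
    % Evaluate the loss and gradients inside dlfeval so dlgradient can trace
    [loss,gradients] = dlfeval(@modelLoss,dlnet,X,T);
    % Update the learnable parameters with the Adam solver
    [dlnet,avgGrad,avgSqGrad] = adamupdate(dlnet,gradients, ...
        avgGrad,avgSqGrad,epoch,learnRate);
end
function [loss,gradients] = modelLoss(dlnet,X,T)
    Y = forward(dlnet,X);
    loss = mse(Y,T);
    gradients = dlgradient(loss,dlnet.Learnables);
end
Note that adamupdate takes an iteration number; in a real loop with several mini-batches per epoch you would pass a running iteration counter rather than the epoch index.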
Refer to the documentation on Automatic Differentiation for more information.