How to build a custom DL layer that contains a layergraph

Hi.
I am trying to build a custom layer representing a single stage in a transformer. For this, I built a layer graph that has the correct layers, connections, and dimensions. However, I am struggling to turn it into a custom layer that I can then use in a larger network (I want to stack multiple of these blocks). I think I am missing the correct approach, as I keep getting errors like this:
"Error using the predict function in layer TransformerBlock" -> "Undefined function 'forward' for input arguments of type 'nnet.cnn.LayerGraph'."
I tried swapping things for dlarray objects, but that just got more confusing. Any tips from the pros?
Thx!
Here is my code for the transformerBlock layer:
classdef transformerBlock < nnet.layer.Layer % ...
    % & nnet.layer.Formattable ... % (Optional)
    % & nnet.layer.Acceleratable % (Optional)
    properties
        tfNetwork
    end
    methods
        function obj = transformerBlock(tfName, percentageDropout, layernormEpsilon, numHeads, numKeyChannels, numValueChannels, outputSize)
            obj.Name = tfName;
            obj.Description = "A transformer block";
            lgraph = returnTFBlock(tfName, percentageDropout, layernormEpsilon, numHeads, numKeyChannels, numValueChannels, outputSize);
            obj.tfNetwork = lgraph;
        end
        function Z = predict(obj, X)
            % Forward input data X through the internal layer graph
            dlX = dlarray(X, 'SCB'); % ensure the input is formatted correctly
            Z = forward(obj.tfNetwork, dlX); % this line triggers the error
        end
    end
end
Here is the code for making the layergraph:
function lgraph = returnTFBlock(tfName, percentageDropout, layernormEpsilon, numHeads, numKeyChannels, numValueChannels, outputSize)
    nameDropoutInput  = strcat(tfName, "_input");
    nameSelfAttention = strcat(tfName, "_SelfAttention");
    nameDropout1      = strcat(tfName, "_dropout1");
    nameDropout2      = strcat(tfName, "_dropout2");
    nameAdd1          = strcat(tfName, "_add1");
    nameAdd2          = strcat(tfName, "_output");
    nameLayernorm1    = strcat(tfName, "_layernorm1");
    nameLayernorm2    = strcat(tfName, "_layernorm2");
    nameConv1d        = strcat(tfName, "_conv1d");
    nameGeLu          = strcat(tfName, "_gelu");

    lgraph = layerGraph();

    % Input dropout
    tempLayers = dropoutLayer(percentageDropout, "Name", nameDropoutInput);
    lgraph = addLayers(lgraph, tempLayers);

    % Self-attention branch
    tempLayers = [
        layerNormalizationLayer("Name", nameLayernorm1, "Epsilon", layernormEpsilon)
        selfAttentionLayer(numHeads, numKeyChannels, "Name", nameSelfAttention, "DropoutProbability", percentageDropout, "NumValueChannels", numValueChannels, "OutputSize", outputSize)
        dropoutLayer(0.1, "Name", nameDropout1)];
    lgraph = addLayers(lgraph, tempLayers);

    % First residual connection
    tempLayers = additionLayer(2, "Name", nameAdd1);
    lgraph = addLayers(lgraph, tempLayers);

    % Feed-forward branch
    tempLayers = [
        layerNormalizationLayer("Name", nameLayernorm2, "Epsilon", layernormEpsilon)
        convolution1dLayer(1, outputSize, "Name", nameConv1d)
        geluLayer("Name", nameGeLu, "Approximation", "tanh")
        dropoutLayer(percentageDropout, "Name", nameDropout2)];
    lgraph = addLayers(lgraph, tempLayers);

    % Second residual connection
    tempLayers = additionLayer(2, "Name", nameAdd2);
    lgraph = addLayers(lgraph, tempLayers);

    lgraph = connectLayers(lgraph, nameDropoutInput, nameLayernorm1);
    lgraph = connectLayers(lgraph, nameDropoutInput, strcat(nameAdd1, "/in2"));
    lgraph = connectLayers(lgraph, nameDropout1, strcat(nameAdd1, "/in1"));
    lgraph = connectLayers(lgraph, nameAdd1, nameLayernorm2);
    lgraph = connectLayers(lgraph, nameAdd1, strcat(nameAdd2, "/in2"));
    lgraph = connectLayers(lgraph, nameDropout2, strcat(nameAdd2, "/in1"));
end
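For reference, the resulting block can be sanity-checked before wrapping it in a custom layer. A minimal sketch (the parameter values here are illustrative assumptions, not from the original post):

```matlab
% Build a block with illustrative hyperparameters (assumed values)
lgraph = returnTFBlock("tf1", 0.1, 1e-5, 4, 64, 64, 64);

% Visualize the layers and connections to verify the two residual paths
figure
plot(lgraph)

% Converting to an uninitialized dlnetwork surfaces wiring errors early;
% Initialize=false is needed because the graph has no input layer
net = dlnetwork(lgraph, Initialize=false);
```

Inspecting the plot confirms that both addition layers receive their residual input before any dimensions are committed.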

Answers (1)

Paras Gupta 2024-5-6
Hey Jast,
I understand that you are trying to create a custom layer that itself contains a layer graph. The error occurs because the layer graph returned by 'returnTFBlock' does not have a forward method: a layerGraph only describes the architecture and cannot execute a forward pass.
The error can be resolved by storing a 'dlnetwork' object as a property of the custom layer and calling that object's forward and predict methods. Please refer to the code snippet below for the changes needed in the 'transformerBlock' class:
classdef transformerBlock < nnet.layer.Layer & nnet.layer.Formattable
    % & nnet.layer.Acceleratable % (Optional)
    properties
        Network
    end
    methods
        function obj = transformerBlock(tfName, percentageDropout, layernormEpsilon, numHeads, numKeyChannels, numValueChannels, outputSize)
            obj.Name = tfName;
            obj.Description = "A transformer block";
            lgraph = returnTFBlock(tfName, percentageDropout, layernormEpsilon, numHeads, numKeyChannels, numValueChannels, outputSize);
            obj.Network = dlnetwork(lgraph, Initialize=false);
        end
        function Z = predict(obj, X)
            % Predict using the internal dlnetwork.
            net = obj.Network;
            dlX = dlarray(X, 'SCB'); % format as spatial-channel-batch
            net = initialize(net, dlX);
            Z = predict(net, dlX);
        end
        function Z = forward(obj, X)
            % Forward (training) pass using the internal dlnetwork.
            net = obj.Network;
            dlX = dlarray(X, 'SCB');
            net = initialize(net, dlX);
            Z = forward(net, dlX);
        end
    end
end
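Once the class stores a dlnetwork, stacking several blocks works like stacking any other layer. A minimal sketch, assuming a sequence input and illustrative hyperparameter values (none of these names or numbers are from the original post):

```matlab
% Illustrative hyperparameters (assumed values)
numHeads = 4; keyCh = 64; valCh = 64; outSize = 64;

layers = [
    sequenceInputLayer(outSize, "Name", "input")
    transformerBlock("tf1", 0.1, 1e-5, numHeads, keyCh, valCh, outSize)
    transformerBlock("tf2", 0.1, 1e-5, numHeads, keyCh, valCh, outSize)
    fullyConnectedLayer(10, "Name", "fc")
    softmaxLayer("Name", "softmax")];

% Assemble the stacked network; each block carries its own dlnetwork
net = dlnetwork(layers);
```

Because the constructor prefixes every internal layer name with tfName, the two blocks cannot collide on layer names inside the larger network.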
You can also refer to the documentation on 'Network Composition' for more information on the code above.
Hope this answers your query.

Release: R2023b