How can I implement a layerNormalizationLayer to generate C code?
I have a trained neural network that contains a layerNormalizationLayer, and I want to generate C code from it, but this layer is not supported for code generation. Is there any way to implement it as a custom layer, or is there another solution?
This is the architecture of the neural network:
layers = [ ...
    featureInputLayer(4, "Name", "myFeatureInputLayer", 'Normalization', 'rescale-symmetric')
    fullyConnectedLayer(16, "Name", "myFullyConnectedLayer1", "WeightsInitializer", "glorot")
    layerNormalizationLayer
    tanhLayer("Name", "myTanhLayer")
    fullyConnectedLayer(8, "Name", "myFullyConnectedLayer4", "WeightsInitializer", "he")
    layerNormalizationLayer
    reluLayer
    fullyConnectedLayer(2, "Name", "myFullyConnectedLayer6", "WeightsInitializer", "he")
    regressionLayer
    ];
This is the entry-point function used to generate the code:
function out = DLfunction(in) %#codegen
% A persistent object mynet is used to load the series network object.
% On the first call to this function, the persistent object is constructed
% and set up. On subsequent calls, the same object is reused to call
% predict on the inputs, which avoids reconstructing and reloading the
% network object.
persistent mynet;
if isempty(mynet)
    mynet = coder.loadDeepLearningNetwork('NNfinal.mat');
end
% Pass the input to the network
out = predict(mynet, in, 'MiniBatchSize', 4);
end
NNfinal.mat contains the trained neural network.
And this is the script I used, following an example:
dlconfig = coder.DeepLearningConfig(TargetLibrary='none');
cfg = coder.config('lib');
cfg.DeepLearningConfig = dlconfig;
myInput = [4 3 4 4];
codegen -config cfg DLfunction -args {myInput} -report
Answers (2)
Oguz Kaan Hancioglu
2023-3-30
Have you tried changing the codegen target language?
cfg = coder.config('lib');
cfg.TargetLang = 'C';
cfg.DeepLearningConfig = coder.DeepLearningConfig(TargetLibrary = 'none');
Best
4 Comments
Oguz Kaan Hancioglu
2023-3-30
Yes, you are right. The layer normalization layer is not supported by MATLAB Coder, and the documentation does not say anything about it. You need to build your architecture without the normalization layer to generate C code. If you want to keep the normalization behaviour anyway, you can compute it by hand in codegen-compatible MATLAB code, as in the sketch below.
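As a rough sketch (not from the original post: the function name myLayerNorm and the gamma/beta arguments are illustrative, and it assumes you extract the learned Scale and Offset from the trained layer), layer normalization of a feature vector is simple enough to write as plain MATLAB that MATLAB Coder supports:
function y = myLayerNorm(x, gamma, beta) %#codegen
% Hand-written layer normalization over the feature dimension.
% x     - feature vector for one observation
% gamma - learned Scale from the trained layerNormalizationLayer (same size as x)
% beta  - learned Offset from the trained layerNormalizationLayer (same size as x)
epsilonVal = 1e-5;              % default Epsilon of layerNormalizationLayer
mu = mean(x);                   % mean over the features
sigma2 = mean((x - mu).^2);     % population variance over the features
y = gamma .* (x - mu) ./ sqrt(sigma2 + epsilonVal) + beta;
end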
Best
Sergio Matiz Romero
2023-5-9
Edited: Sergio Matiz Romero
2023-5-9
A potential workaround is to use groupNormalizationLayer in 'all-channels' mode instead of layerNormalizationLayer, since they are equivalent. You can find more information on the group normalization modes in the documentation for groupNormalizationLayer.
Below is an example that compares the code generation output of a network using groupNormalizationLayer in 'all-channels' mode with that of an equivalent network using layerNormalizationLayer. For the network using layerNormalizationLayer we do not generate code (since it is not supported); instead, we run the entry-point function directly in MATLAB. The output values match when compared.
rng('default')
cfg = coder.config('mex');
cfg.DeepLearningConfig = coder.DeepLearningConfig('none');
inputSize = [8 8 16];
% Network with layer normalization
layers = [imageInputLayer(inputSize, 'Name', 'input')
    layerNormalizationLayer('Name', 'norm')
    ];
dlnet = dlnetwork(layers);
save('dlnetLayerNorm.mat', 'dlnet');
% Network with group normalization in mode 'all-channels'
layers = [imageInputLayer(inputSize, 'Name', 'input')
    groupNormalizationLayer('all-channels', 'Name', 'norm')
    ];
dlnet = dlnetwork(layers);
save('dlnetGroupNorm.mat', 'dlnet');
in = dlarray(randi(255, inputSize, 'single'), 'SSCB');
codegen mPredict -config cfg -args {in, coder.Constant('dlnetGroupNorm.mat')}
outMat = mPredict(in, 'dlnetLayerNorm.mat');
outCode = mPredict_mex(in, 'dlnetGroupNorm.mat');
diff = max(abs(outMat(:) - outCode(:)));
display(diff)
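If the two layers really are equivalent, you would expect diff to be at the level of single-precision round-off, since the only remaining differences come from the generated code's arithmetic.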
The entry-point function is:
function out = mPredict(in, matfile)
%#codegen
net = coder.loadDeepLearningNetwork(coder.const(matfile));
out = net.predict(in);
end
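Applied to the architecture in the question, the workaround would look roughly like the sketch below (untested here: the layer names myGroupNormLayer1/2 are made up, it assumes groupNormalizationLayer accepts the feature data coming out of the fully connected layers, and the network has to be retrained because NNfinal.mat still contains the unsupported layers):
layers = [ ...
    featureInputLayer(4, "Name", "myFeatureInputLayer", 'Normalization', 'rescale-symmetric')
    fullyConnectedLayer(16, "Name", "myFullyConnectedLayer1", "WeightsInitializer", "glorot")
    groupNormalizationLayer('all-channels', 'Name', 'myGroupNormLayer1') % replaces layerNormalizationLayer
    tanhLayer("Name", "myTanhLayer")
    fullyConnectedLayer(8, "Name", "myFullyConnectedLayer4", "WeightsInitializer", "he")
    groupNormalizationLayer('all-channels', 'Name', 'myGroupNormLayer2') % replaces layerNormalizationLayer
    reluLayer
    fullyConnectedLayer(2, "Name", "myFullyConnectedLayer6", "WeightsInitializer", "he")
    regressionLayer
    ];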
Please let me know if this workaround unblocks your workflow or if you have additional questions.