How to set the input of the layer when defining a nested deep learning layer
I am following this example to create a custom deep learning layer consisting of existing layers:
It seems that this example hasn't been validated, as there are multiple typos in the code. Anyway, as the example shows, the internal network graph has two branches (one of them optional). But when you look at where the network is defined, there is no way of connecting the input of the optional branch to the input of the newly defined layer.
layers = [
convolution2dLayer(3,numFilters,Padding="same",Stride=stride)
groupNormalizationLayer("all-channels")
reluLayer
convolution2dLayer(3,numFilters,Padding="same")
groupNormalizationLayer("channel-wise")
additionLayer(2,Name="add")
reluLayer];
lgraph = layerGraph(layers);
% Add skip connection.
if includeSkipConvolution
layers = [
convolution2dLayer(1,numFilters,Stride=stride)
groupNormalizationLayer("all-channels",Name="gnSkip")];
lgraph = addLayers(lgraph,layers);
lgraph = connectLayers(lgraph,'gnSkip','add/in2');
end
In this code, the line "lgraph = connectLayers(lgraph,'gnSkip','add/in2');" connects the output of the optional branch to the input of the addition layer, but how should I connect the input of the optional branch to the input of the layer itself (there is no previous layer here)? The MATLAB example is incomplete in this respect, as you would also see by running the checkLayer command on the created layer.
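For context, this is roughly the check I am running (the constructor call residualBlockLayer(64) and the input size are just placeholders for my local setup):
% Rough validation call; constructor arguments and input size are placeholders
layer = residualBlockLayer(64);
checkLayer(layer,[56 56 64],ObservationDimension=4)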
Any help will be appreciated. Thanks.
Answer (1)
Sandeep
2023-5-26
Hi Mirko Prezioso,
To connect the input of the optional branch to the input of the newly defined layer, you can add a separate input layer to the definition of the nested layer and then connect that input layer to the first layer of each branch. You can define the input layer with imageInputLayer(), giving it the appropriate input size and name, and then define the internal network and the optional skip connection in the same way as the example in the MATLAB documentation. A sample implementation is given below.
% Sample implementation (the [32 32 3] input size is a placeholder)
function net = nestedConvBlock(numFilters, includeSkipConvolution, stride)
% Shared input layer exposed by the internal network
inputLayer = imageInputLayer([32 32 3], 'Name', 'input', 'Normalization', 'none');
% Define the internal network (main branch)
convBlock = [
    convolution2dLayer(3, numFilters, 'Padding', 'same', 'Stride', stride, 'Name', 'conv_1')
    batchNormalizationLayer('Name', 'bn_1')
    reluLayer('Name', 'relu_1')
    convolution2dLayer(3, numFilters, 'Padding', 'same', 'Name', 'conv_2')
    batchNormalizationLayer('Name', 'bn_2')
    additionLayer(2, 'Name', 'add')
    reluLayer('Name', 'relu_2')];
lgraph = layerGraph(inputLayer);
lgraph = addLayers(lgraph, convBlock);
lgraph = connectLayers(lgraph, 'input', 'conv_1');
% Define the optional skip connection, fed from the same input layer
if includeSkipConvolution
    skipConvolution = [
        convolution2dLayer(1, numFilters, 'Stride', stride, 'Name', 'skipConv')
        batchNormalizationLayer('Name', 'bnSkip')];
    lgraph = addLayers(lgraph, skipConvolution);
    lgraph = connectLayers(lgraph, 'input', 'skipConv');
    lgraph = connectLayers(lgraph, 'bnSkip', 'add/in2');
else
    % Without the skip convolution, route the input directly to the adder
    lgraph = connectLayers(lgraph, 'input', 'add/in2');
end
net = dlnetwork(lgraph);
end
Keep in mind that this is just one way to connect the input of an optional branch to the input of a nested deep learning layer. Depending on the specifics of your implementation, there may be other ways to do this.
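For example, one alternative (a sketch only, not tested) is to leave the first layer of each branch unconnected, create the internal dlnetwork uninitialized, and then feed the layer input to every network input inside the custom layer's predict method:
% Alternative sketch: no explicit input layer; with an uninitialized dlnetwork,
% the unconnected layer inputs become network inputs (lgraph as in the question)
net = dlnetwork(lgraph,'Initialize',false);
% In the custom layer's predict method, pass the layer input X to each
% network input, e.g.:
%   Z = predict(layer.Network,X,X);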