How to create an initialize function for a custom layer where the learnable parameters have the same size as the input?

I want to write an initialize function inside a custom layer where the learnable parameters have the same size as the input, which is unknown when the layer is created. Is this possible? I understood from Define Custom Deep Learning Layer with Learnable Parameters - MATLAB & Simulink - MathWorks India that this can be achieved by using networkDataLayout objects. For instance, while assembling the deep learning network, MATLAB analyzes the network using an input with a batch size of 1, and later, during training, the batch size changes to the value specified in the training options. Is there any way to initialize the custom layer in that way?

Answers (1)

Malay Agarwal 2024-9-25, 5:34
Edited: 2024-9-25, 5:35
I am assuming you are using MATLAB R2024b.
You can initialize such a layer by implementing the initialize() method of your custom layer. The initialize() method takes two arguments: the layer object itself and a networkDataLayout object that describes the layout of the layer's input (usually named layout in the method's implementation):
function layer = initialize(layer,layout)
    % (Optional) Initialize layer learnable and state parameters.
    %
    % Inputs:
    %     layer  - Layer to initialize
    %     layout - Data layout, specified as a networkDataLayout
    %              object
    %
    % Outputs:
    %     layer - Initialized layer
    %
    % - For layers with multiple inputs, replace layout with
    %   layout1,...,layoutN, where N is the number of inputs.

    % Define layer initialization function here.
end
To access the input size, use the Size property of the networkDataLayout object. When you train a network that contains your custom layer, MATLAB automatically creates a networkDataLayout object with the size of the layer's incoming input and passes it to this method to initialize the layer.
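For instance, here is a minimal sketch of inspecting a layout's Size property and locating a labeled dimension with finddim (the sizes and "SSCB" format are just assumptions for illustration):
% Minimal sketch: inspect a networkDataLayout object (sizes are illustrative)
layout = networkDataLayout([28 28 3 NaN],"SSCB");  % NaN marks an unknown (batch) size
layout.Size          % returns [28 28 3 NaN]
finddim(layout,"C")  % returns 3, the index of the channel dimension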
The example you shared also shows an implementation of the initialize() function where the parameters are initialized based on the channel dimension of the input: https://www.mathworks.com/help/deeplearning/ug/define-custom-deep-learning-layer.html#mw_0679ac65-be66-477c-9a76-912c32c1ab27.
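In that example, the learnable parameter is sized from the channel dimension only. A rough sketch of that pattern (the parameter name Alpha is illustrative) looks like this:
function layer = initialize(layer,layout)
    % Sketch: size the learnable parameter from the channel dimension only
    idx = finddim(layout,"C");        % locate the channel dimension
    numChannels = layout.Size(idx);   % number of channels in the input
    if isempty(layer.Alpha)
        layer.Alpha = rand([1 1 numChannels]);  % one learnable value per channel
    end
end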
You can adapt the example to use all the dimensions of the input. For example:
classdef customLayer < nnet.layer.Layer
    properties (Learnable)
        Parameter
    end

    % Other code

    methods
        function layer = initialize(layer,layout)
            % Initialize the learnable parameter with the same size as the input
            if isempty(layer.Parameter)
                layer.Parameter = randn(layout.Size);
            end
        end
    end
end
Note that the isempty check means the parameter is only initialized when it has no value yet, so a layer that has already been initialized (for example, one loaded from a trained network) keeps its existing values.
If you'd like to test the initialization without creating a full-fledged network, you can do something like this:
% Define the input size - this will vary based on what your layer does
inputSize = [224 224 3];

% Manually create a networkDataLayout object
layout = networkDataLayout(inputSize,"SSC");

% Create the layer
layer = customLayer();

% Manually initialize the layer
layer = initialize(layer,layout);

% Check that the parameter has the same size as the input
isequal(size(layer.Parameter),inputSize)
For more information, refer to the documentation for networkDataLayout objects and the initialize method.
Hope this helps!
13 comments
Malay Agarwal 2024-9-25, 16:13
Edited: 2024-9-25, 16:14
I think if you use the layer in a network, you can access the batch dimension just like any other dimension by calling finddim and passing "B" as the label argument. The NaN issue you were facing earlier occurred only because you passed NaN as the last dimension of the input size:
inputSize = [3 NaN];
But if you do use the batch dimension, your implementation might not be correct and you may not get the results you're expecting.
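For example, a minimal sketch of reading the batch dimension inside initialize() could look like this (the variable names are assumptions for illustration, and it assumes the layout actually has a "B" dimension):
function layer = initialize(layer,layout)
    % Sketch: locate the batch dimension by its "B" label
    batchDim = finddim(layout,"B");         % empty if the layout has no "B" dimension
    if ~isempty(batchDim)
        batchSize = layout.Size(batchDim);  % may be NaN if unknown during network analysis
        % ... use batchSize to size a learnable parameter, if that is really intended
    end
end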
BIPIN SAMUEL 2024-9-26, 3:47
Yes, maybe this is right for higher-dimensional input data with formats like "SSCB", "SSCBT", and so on. But in this case the input is 2-D, so both the rows and the columns take part in the multiplication operation, and it may not affect the efficiency of the network. Anyway, I will try that again.

