Is a Bi-GRU (bidirectional Gated Recurrent Unit) available, or is there a way to implement a Bi-GRU?

Several recurrent neural network (RNN) layers are available, such as lstmLayer, bilstmLayer, and gruLayer.
However, I would like to know whether a Bi-GRU exists or can be defined.
Thank you for your help.

Accepted Answer

Amanjit Dulai 2021-10-21
Edited: 2022-11-22
A bi-LSTM layer works by applying two LSTM layers to the data: one in the forward direction and one in the reverse direction. You can apply an LSTM in the reverse direction by flipping the data along the time dimension. The outputs of these two LSTM layers are then concatenated to form the output of the bi-LSTM layer. So to implement a bi-GRU layer, we can use a custom flip layer together with GRU layers. A custom flip layer can be implemented as follows:
classdef FlipLayer < nnet.layer.Layer
    % Custom layer that reverses the time order of a sequence by
    % flipping the input along its third (time) dimension.
    methods
        function layer = FlipLayer(name)
            layer.Name = name;
        end
        function Y = predict(~, X)
            Y = flip(X, 3);   % dimension 3 is the time dimension
        end
    end
end
A bi-GRU layer with OutputMode="sequence" can be built by feeding the input both into a forward GRU layer and into a flip-GRU-flip branch, then concatenating the two outputs along the channel dimension. A bi-GRU layer with OutputMode="last" is arranged the same way, except the reverse branch needs no flip after its GRU layer, because only the last time step is output. Below is a short example showing how to use the custom flip layer above to implement a network with two bi-GRU layers (one with OutputMode="sequence" and one with OutputMode="last"):
% Load the Japanese Vowels training data (sequences with 12 features).
[XTrain, YTrain] = japaneseVowelsTrainData;

% Main branch: forward GRU layers plus concatenation points for the reverse branches.
lg = layerGraph();
lg = addLayers(lg, [
    sequenceInputLayer(12, "Name", "input")
    gruLayer(100, 'OutputMode', 'sequence', "Name", "gru1")
    concatenationLayer(1, 2, "Name", "cat1")
    gruLayer(100, 'OutputMode', 'last', "Name", "gru3")
    concatenationLayer(1, 2, "Name", "cat2")
    fullyConnectedLayer(9)
    softmaxLayer
    classificationLayer()]);

% Reverse branch of the first bi-GRU (OutputMode="sequence"): flip, GRU, flip back.
lg = addLayers(lg, [
    FlipLayer("flip1")
    gruLayer(100, 'OutputMode', 'sequence', "Name", "gru2")
    FlipLayer("flip2")]);

% Reverse branch of the second bi-GRU (OutputMode="last"): flip, GRU (no flip back needed).
lg = addLayers(lg, [
    FlipLayer("flip3")
    gruLayer(100, 'OutputMode', 'last', "Name", "gru4")]);

lg = connectLayers(lg, "input", "flip1");
lg = connectLayers(lg, "flip2", "cat1/in2");
lg = connectLayers(lg, "cat1", "flip3");
lg = connectLayers(lg, "gru4", "cat2/in2");

options = trainingOptions('adam', 'Plots', 'training-progress');
net = trainNetwork(XTrain, YTrain, lg, options);

[XTest, YTest] = japaneseVowelsTestData;
YPred = classify(net, XTest);
accuracy = sum(YTest == YPred)/numel(YTest)
7 Comments
Amanjit Dulai 2022-6-21
The third dimension is the time dimension. For a bi-GRU, we want one GRU layer to operate on the sequence in the forward time direction and another to operate on it in the reverse time direction. Flipping along the third dimension reverses the time order, which is how the second GRU layer sees the sequence in reverse.
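As a minimal sketch (assuming the data is arranged as channels x observations x time, the layout described above; the toy array here is purely illustrative), flipping along dimension 3 reverses only the time order:
% Toy array: 2 channels, 1 observation, 4 time steps.
X = reshape(1:8, [2 1 4]);
squeeze(X)            % columns are time steps 1..4
squeeze(flip(X, 3))   % same values, time steps now in order 4..1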


More Answers (5)

Ronny Guendel 2021-10-22
The full working code for me (the only change from the accepted answer is that the final fullyConnectedLayer, softmaxLayer, and classificationLayer are given explicit names):
[XTrain, YTrain] = japaneseVowelsTrainData;
lg = layerGraph();
lg = addLayers(lg, [
    sequenceInputLayer(12, "Name", "input")
    gruLayer(100, 'OutputMode', 'sequence', "Name", "gru1")
    concatenationLayer(1, 2, "Name", "cat1")
    gruLayer(100, 'OutputMode', 'last', "Name", "gru3")
    concatenationLayer(1, 2, "Name", "cat2")
    fullyConnectedLayer(9, 'Name', 'fc')
    softmaxLayer('Name', 'softmax')
    classificationLayer('Name', 'classoutput')]);
lg = addLayers(lg, [
    FlipLayer("flip1")
    gruLayer(100, 'OutputMode', 'sequence', "Name", "gru2")
    FlipLayer("flip2")]);
lg = addLayers(lg, [
    FlipLayer("flip3")
    gruLayer(100, 'OutputMode', 'last', "Name", "gru4")]);
lg = connectLayers(lg, "input", "flip1");
lg = connectLayers(lg, "flip2", "cat1/in2");
lg = connectLayers(lg, "cat1", "flip3");
lg = connectLayers(lg, "gru4", "cat2/in2");
options = trainingOptions('adam', 'Plots', 'training-progress');
net = trainNetwork(XTrain, YTrain, lg, options);
[XTest, YTest] = japaneseVowelsTestData;
YPred = classify(net, XTest);
accuracy = sum(YTest == YPred)/numel(YTest)

义 2022-11-22
Why can't I find FlipLayer in MATLAB R2022a?
2 Comments
Amanjit Dulai 2022-11-22
FlipLayer is a custom layer you need to implement. The code is shown below:
classdef FlipLayer < nnet.layer.Layer
    methods
        function layer = FlipLayer(name)
            layer.Name = name;
        end
        function Y = predict(~, X)
            Y = flip(X, 3);
        end
    end
end



义 2022-11-25
If my data is 1×n power system load data, how should I set these parameters?

song 2023-6-28
If my data is 1×n bearing system load data, how should I set these parameters?

Artem Lensky 2023-8-17
Edited: 2023-8-17
Below is a multi-block BiGRU implementation with layer normalization and dropout layers.
The model takes the following parameters:
numFeatures, numBlocks, numHiddenUnits, numResponse, dropoutFactor
and the function constructing the BiGRU is implemented as follows:
function lgraph = constructBiGRU(numFeatures,numBlocks,numHiddenUnits,numResponse,dropoutFactor)
arguments
    numFeatures = 1
    numBlocks = 1
    numHiddenUnits = 32
    numResponse = 3
    dropoutFactor = 0
end
layer = sequenceInputLayer(numFeatures,'Name','sequenceInputLayer');
lgraph = layerGraph(layer);
outputName = lgraph.Layers(end).Name;
for i = 1:numBlocks
    % Forward branch: GRU, concatenation with the reverse branch, dropout, layer norm.
    layers = [gruLayer(numHiddenUnits, OutputMode='sequence', Name="gru_1_" + i)
        concatenationLayer(1, 2, Name="cat_" + i)
        dropoutLayer(dropoutFactor, Name="dropout" + i)
        layerNormalizationLayer(Name="layernorm_" + i)];
    lgraph = addLayers(lgraph, layers);
    % Reverse branch: flip, GRU, flip back.
    layers = [FlipLayer("flip_1_" + i)
        gruLayer(numHiddenUnits, OutputMode='sequence', Name="gru_2_" + i)
        FlipLayer("flip_2_" + i)];
    lgraph = addLayers(lgraph, layers);
    lgraph = connectLayers(lgraph, outputName, "gru_1_" + i);
    lgraph = connectLayers(lgraph, outputName, "flip_1_" + i);
    lgraph = connectLayers(lgraph, "flip_2_" + i, "cat_" + i + "/in2");
    outputName = "layernorm_" + i;
end
% Remove the layer normalization layer of the last block; the output head
% is connected to that block's dropout layer instead.
lgraph = removeLayers(lgraph, "layernorm_" + i);
layers = [fullyConnectedLayer(numResponse,'Name','fc')
    softmaxLayer
    classificationLayer];
lgraph = addLayers(lgraph, layers);
lgraph = connectLayers(lgraph, "dropout" + i, "fc");
end
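A hypothetical usage sketch (the parameter values here are only illustrative, and the FlipLayer class from the accepted answer must be on the path):
% Build a 2-block BiGRU graph for 1 input feature, 3 classes, 20% dropout.
lgraph = constructBiGRU(1, 2, 64, 3, 0.2);
analyzeNetwork(lgraph);   % inspect the layer graph before training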
