How are the two layers "sequenceInputLayer(num_channels) & bilstmLayer(HU,OutputMode="sequence")" connected to each other?

9 views (last 30 days)
Hello, I would like to know how the connection between the sequenceInputLayer and an lstmLayer or bilstmLayer is implemented.
Usually BiLSTM layers have separate weight matrices for each channel of the input.
In a typical BiLSTM network with a sequenceInputLayer and a bilstmLayer, each unit of the bilstmLayer would be connected to every channel of the sequenceInputLayer. This means that each unit of the bilstmLayer receives input from all channels of the sequenceInputLayer, i.e. each unit receives the entire feature vector at every time step.
Is this correct? Please feel free to forward me the documentation on this topic.
Thank you very much and best regards
Chris

Answers (2)

Debadipto
Debadipto 2024-4-23
Yes, your understanding is correct about how a BiLSTM (Bidirectional Long Short-Term Memory) layer connects to a sequence input layer. When a sequence input layer is followed by a BiLSTM layer, the input sequence is fed into both the forward and the backward LSTM component of the BiLSTM. Each hidden unit in both components receives the full feature vector (all input channels) at every time step; the two components differ only in direction: the forward LSTM processes the sequence from start to end, while the backward LSTM processes it from end to start.
Regarding the documentation, the specifics of how these connections are implemented can vary depending on the software or framework you're using, so check the documentation for your particular framework.
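To make the "each unit sees all channels" point concrete, here is a minimal sketch of a single LSTM gate computation at one time step. The variable names (Wi, Ri, bi, x_t) are my own illustration, not taken from any framework's internals:

```matlab
% Sketch (illustrative names): the input-gate computation at time step t.
% Each hidden unit receives the ENTIRE feature vector x_t, not one channel.
numChannels = 3;
numHiddenUnits = 5;
Wi = randn(numHiddenUnits, numChannels);   % one row per unit, one column per channel
Ri = randn(numHiddenUnits, numHiddenUnits);% recurrent weights
bi = zeros(numHiddenUnits, 1);             % bias
x_t = randn(numChannels, 1);               % full feature vector at time step t
h_prev = zeros(numHiddenUnits, 1);         % previous hidden state
i_t = sigmoid_(Wi*x_t + Ri*h_prev + bi);   % every unit mixes all channels

function y = sigmoid_(x)
    y = 1./(1 + exp(-x));
end
```

Because Wi*x_t is a full matrix-vector product, every hidden unit's activation depends on every input channel; the backward component works the same way, just traversing the sequence in the other direction.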
  1 Comment
Christian Holz
Christian Holz 2024-4-26
Edited: Christian Holz 2024-4-26
Thank you very much for your answer.
I agree with your description, but unfortunately I cannot find any reference to this in the MathWorks documentation. A corresponding description of the implementation would confirm our assumption.



Ieuan Evans
Ieuan Evans 2024-4-26
Hi Christian,
For BiLSTM layers in MATLAB, the weight matrices for all gates, for both the forward and backward parts of the layer, are concatenated into a single matrix spanning all input channels. For example, the InputWeights property is an 8*NumHiddenUnits-by-InputSize matrix, where NumHiddenUnits and InputSize are the number of hidden units and the number of input channels, respectively.
In this case, the input weight matrix is a concatenation of the eight input weight matrices for the components (gates) in the bidirectional LSTM layer. The eight matrices are concatenated vertically in this order:
  • Input gate (Forward)
  • Forget gate (Forward)
  • Cell candidate (Forward)
  • Output gate (Forward)
  • Input gate (Backward)
  • Forget gate (Backward)
  • Cell candidate (Backward)
  • Output gate (Backward)
For a diagram that shows how data flows through a BiLSTM layer, see https://uk.mathworks.com/help/deeplearning/ug/create-bilstm-function.html
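You can check the described weight layout yourself by initializing a small network and inspecting the layer's InputWeights property. This is a sketch assuming a recent release that supports the name=value syntax and dlnetwork initialization:

```matlab
% Sketch: confirm the InputWeights layout of a bilstmLayer.
numChannels = 3;        % channels in the sequenceInputLayer
numHiddenUnits = 5;     % hidden units (HU) in the bilstmLayer
layers = [
    sequenceInputLayer(numChannels)
    bilstmLayer(numHiddenUnits, OutputMode="sequence")
];
net = dlnetwork(layers);           % initializes the learnable parameters
W = net.Layers(2).InputWeights;    % expected size: [8*numHiddenUnits, numChannels]
size(W)
```

Each block of numHiddenUnits rows in W is one gate's weight matrix (four forward gates, then four backward gates, in the order listed above), and every row spans all numChannels input channels, i.e. full connectivity between the input layer and each unit.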

Release

R2023b
