How to use LSTM to solve seq2seq problem in MATLAB?

I'm struggling with a seq2seq problem: given 200 past values as input, use an LSTM network to predict the next 10 values in one shot (without using closed-loop forecasting). I tried writing the code shown below, but it reports an error: "Invalid training data. For regression tasks, responses must be a vector, a matrix, or a 4-D array of real numeric responses. Responses must not contain NaNs."
inputSize = 5;       % Number of input features
numTimeSteps = 200;  % Number of time steps
numSamples = 100;    % Number of samples
% Create random input data: a cell array in which each element is a [5 x 200] matrix
X = cell(numSamples, 1);
for i = 1:numSamples
    X{i} = rand(inputSize, numTimeSteps);
end
outputSize = 1;        % Number of output features for each time step
outputTimeSteps = 10;  % Number of output time steps
numSamples = 100;      % Number of samples
% Create random output data: a cell array in which each element is a [1 x 10] matrix
Y = cell(numSamples, 1);
for i = 1:numSamples
    Y{i} = rand(outputSize, outputTimeSteps);
end
numHiddenUnits1 = 128; % Number of hidden units in the first LSTM layer
numHiddenUnits2 = 64;  % Number of hidden units in the second LSTM layer
outputSize = 1;        % Number of output features
layers = [
    sequenceInputLayer(inputSize)                        % Input layer, 5 input features
    lstmLayer(numHiddenUnits1, 'OutputMode', 'last')     % First LSTM layer, outputs only the last time step
    fullyConnectedLayer(numHiddenUnits2)                 % Fully connected layer, transforms the dimensions for the second LSTM layer
    functionLayer(@(X) repmat(X, [1, outputTimeSteps]), 'Name', 'replicate10') % Replicate the last time step's output across 10 time steps
    lstmLayer(numHiddenUnits2, 'OutputMode', 'sequence') % Second LSTM layer, outputs a sequence of 10 time steps
    fullyConnectedLayer(outputSize)                      % Fully connected layer, 1 output feature
    regressionLayer];                                    % Regression layer for regression tasks
% Training options
options = trainingOptions('adam', ...
    'MaxEpochs', 50, ...
    'MiniBatchSize', 32, ...
    'Shuffle', 'every-epoch', ...
    'Verbose', false);
% Train the network
net = trainNetwork(X, Y, layers, options);
Now I have two questions:
(1) What data format should be used for the input and output of the model during training? Cell arrays or 3D arrays?
(2) How can I control the number of time steps over which the input and output of each layer (input layer, LSTM layer, fully connected layer) are expanded in an LSTM network?

Answers (1)

Subhajyoti 2024-9-9
Hi @YP,
For sequence data in deep learning models, the input and output data are usually managed as cell arrays, with one cell per observation.
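For reference, here is a minimal sketch of that layout using the sizes from your post (illustrative only, not a complete training script):
numObs = 100;
X = cell(numObs, 1); % predictors: one cell per observation
Y = cell(numObs, 1); % responses: one cell per observation
for i = 1:numObs
    X{i} = rand(5, 200); % numFeatures-by-numInputTimeSteps
    Y{i} = rand(1, 10);  % numResponses-by-numOutputTimeSteps
end
% Rows are features (channels), columns are time steps; observations are
% stacked along the cell array rather than along a third numeric dimension.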
You can also use the ‘dlarray’ object in MATLAB for handling data in deep learning tasks, since it is well integrated with MATLAB's Deep Learning Toolbox and makes it easier to work with complex data structures. It stores data with optional data format labels for custom training loops and enables functions to compute and use derivatives through automatic differentiation.
You can control the time-steps at each layer using the time dimension – denoted as “T” in the ‘fmt’ (Data Format) input argument for ‘dlarray()’.
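For illustration, here is a small sketch of labelling the time dimension with ‘dlarray’ (the sizes below are placeholders, assuming a channel-batch-time layout):
numFeatures   = 5;
miniBatchSize = 32;
numTimeSteps  = 200;
% "C" = channel (features), "B" = batch (observations), "T" = time
dlX = dlarray(rand(numFeatures, miniBatchSize, numTimeSteps), "CBT");
dims(dlX)    % 'CBT'
size(dlX, 3) % 200 time steps: the dimension that recurrent layers unroll over
In a custom training loop built on a ‘dlnetwork’, recurrent layers such as ‘lstmLayer’ operate along whichever dimension carries the "T" label.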
Refer to the MathWorks documentation on ‘dlarray’ to learn more about how it is used in MATLAB.
To address the part of the error message about ‘NaN’s, check the dataset for ‘NaN’s before training the network. You can use the following snippet in your code:
inputHasNaNs = any(isnan(cell2mat(X)), 'all'); % check every element, not just column-wise
if inputHasNaNs
    error('Input data contains NaNs.');
end
The above snippet throws an error if ‘NaN’s are detected, ensuring that the network training process only begins after the data is cleaned.
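Since the error message also mentions the responses, it may be worth applying the same check to ‘Y’ (a small sketch, reusing the variable names from your code):
responseHasNaNs = any(isnan(cell2mat(Y)), 'all');
if responseHasNaNs
    error('Response data contains NaNs.');
end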
Additionally, you can refer to the MathWorks example ‘Sequence-to-Sequence Regression Using Deep Learning’ to learn more.
2 Comments
Subhajyoti 2024-9-13
I'm glad to hear that the solution helped you!
You can accept the answer if you feel it will be helpful for others as well.
