Error with CNN and LSTM network

9 views (last 30 days)
Good day,
I am attempting to build a combined CNN and LSTM network with the following layers:
lgraph = layerGraph();
tempLayers = [
    sequenceInputLayer(InputSize,"Name","sequence")
    sequenceFoldingLayer("Name","seqfold")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
    % Convolutional block 1: 32 filters of size [40 1], stride 1, no padding.
    convolution2dLayer([40 1],32,'Stride',1,"Name","conv_1")
    batchNormalizationLayer("Name","batchnorm_1")
    leakyReluLayer("Name","relu_1")
    maxPooling2dLayer([4 1],'Padding',"same","Name","maxpool_1")
    dropoutLayer(0.1,"Name","dropout_1")
    % Convolutional block 2: same structure as block 1.
    convolution2dLayer([40 1],32,'Stride',1,"Name","conv_2")
    batchNormalizationLayer("Name","batchnorm_2")
    leakyReluLayer("Name","relu_2")
    maxPooling2dLayer([4 1],'Padding',"same","Name","maxpool_2")
    dropoutLayer(0.1,"Name","dropout_2")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
    sequenceUnfoldingLayer("Name","sequnfold")
    flattenLayer("Name","flatten")
    lstmLayer(128,"Name","lstm_1","OutputMode","last")
    lstmLayer(128,"Name","lstm_2","OutputMode","last")
    fullyConnectedLayer(1,"Name","fc")
    %softmaxLayer("Name","softmaxlayer")
    %classificationLayer("Name","classificationoutput")
    regressionLayer("Name","regressionoutput")];
lgraph = addLayers(lgraph,tempLayers);
%% Connect Layer Branches
clear tempLayers;
lgraph = connectLayers(lgraph,"seqfold/out","conv_1");
lgraph = connectLayers(lgraph,"seqfold/miniBatchSize","sequnfold/miniBatchSize");
lgraph = connectLayers(lgraph,"dropout_2","sequnfold/in");
However, when I try to train the network using a training input that is a 4-D double array and an output that is a 200-element column vector of HR data, I receive the following error:
"Error using trainNetwork (line 183)
Invalid training data. For a recurrent layer with output mode 'last', inputs must be cell arrays.
Error in ecng_6700_cw1_hw4_test_codem (line 235)
net = trainNetwork(train_input,estimator_train_output,lgraph,opts);"
I am unsure what the issue is with my data.
  2 Comments
James Lu 2022-2-4
Have you tried changing the first LSTM layer to
lstmLayer(128,"Name","lstm_1","OutputMode","sequence")
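For context on this suggestion: in a stack of LSTM layers, the earlier layers must emit the full sequence so the next LSTM still receives a sequence, and only the final recurrent layer uses OutputMode "last" for sequence-to-one regression. A minimal sketch of that pattern (layer sizes are illustrative, taken from the question):

```matlab
% Illustrative stack: the first LSTM passes the whole sequence on,
% and only the last LSTM collapses it to a single time step.
lstmStack = [
    lstmLayer(128,"Name","lstm_1","OutputMode","sequence")
    lstmLayer(128,"Name","lstm_2","OutputMode","last")
    fullyConnectedLayer(1,"Name","fc")
    regressionLayer("Name","regressionoutput")];
```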
Vinay Kulkarni 2023-3-13
Tried this, but I am getting this error:
Error in Train_Model (line 60)
net =trainNetwork(XTrain,YTest,layers,options);
Caused by:
Layer 'LSTM1': LSTM layers must have scalar input size, but input size (32×16) was received. Try using a flatten layer before the LSTM layer.
And with the addition of a flatten layer:
Error using trainNetwork (line 184)
The training sequences are of feature dimension 653956 32 but the input layer expects
sequences of feature dimension 32 16.
Error in Train_Model (line 60)
net =trainNetwork(XTrain,YTest,layers,options);


Answers (1)

yanqi liu 2022-2-8
Yes, sir. As per James Lu's idea, maybe use:
tempLayers = [
    sequenceUnfoldingLayer("Name","sequnfold")
    flattenLayer("Name","flatten")
    lstmLayer(128,"Name","lstm_1","OutputMode","sequence")
    lstmLayer(128,"Name","lstm_2","OutputMode","sequence")
    fullyConnectedLayer(1,"Name","fc")
    %softmaxLayer("Name","softmaxlayer")
    %classificationLayer("Name","classificationoutput")
    regressionLayer("Name","regressionoutput")];
or convert the data to cell arrays, such as:
[XTrain,YTrain] = japaneseVowelsTrainData;
XTrain
XTrain = 270×1 cell array
{12×20 double} {12×26 double} {12×22 double} {12×20 double} {12×21 double} {12×23 double} {12×22 double} {12×18 double} {12×24 double} {12×15 double} {12×23 double} {12×15 double} {12×17 double} {12×14 double} {12×14 double} {12×15 double}
Now you can see the cell data format; then you can try your original network layers.
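As a sketch of that cell conversion for numeric data (assuming, hypothetically, that the raw data `XAll` is a numFeatures-by-numTimeSteps-by-numObs double array; adjust the indexing to your actual layout):

```matlab
% Hypothetical layout: XAll is numFeatures-by-numTimeSteps-by-numObs.
% trainNetwork expects sequence data as a numObs-by-1 cell array,
% with one numFeatures-by-numTimeSteps matrix per observation.
numObs = size(XAll,3);
XTrain = cell(numObs,1);
for i = 1:numObs
    XTrain{i} = XAll(:,:,i);
end
```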
  1 Comment
Vinay Kulkarni 2023-3-13
Tried your above suggestion of adding sequence-unfolding and flatten layers, but I am still getting errors, such as:
layers = [
    sequenceUnfoldingLayer("Name","sequnfold")
    flattenLayer("Name","flatten")
    lstmLayer(32,"Name","LSTM1","OutputMode","sequence")
Error in Train_Model (line 60)
net =trainNetwork(XTrain,YTest,layers,options);
Caused by:
Network: Missing input layer. The network must have at least one input layer.
Layer 'sequnfold': Unconnected input. Each layer input must be connected to the output of another layer.
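These "Missing input layer" and "Unconnected input" errors arise because a sequenceUnfoldingLayer cannot start a plain layer array: the fold/unfold pair only works inside a layerGraph whose in and miniBatchSize ports are wired up with connectLayers, as in the original question. A minimal wiring sketch with hypothetical sizes (the output side of the graph is omitted; the point is the connections, not the dimensions):

```matlab
% Hypothetical input size [32 16 1]; single conv layer for brevity.
lgraph = layerGraph();
lgraph = addLayers(lgraph,[
    sequenceInputLayer([32 16 1],"Name","sequence")
    sequenceFoldingLayer("Name","seqfold")]);
lgraph = addLayers(lgraph,convolution2dLayer([3 3],8,"Name","conv_1"));
lgraph = addLayers(lgraph,[
    sequenceUnfoldingLayer("Name","sequnfold")
    flattenLayer("Name","flatten")
    lstmLayer(32,"Name","LSTM1","OutputMode","sequence")]);
% Wire the folded branch and the fold/unfold miniBatchSize ports.
lgraph = connectLayers(lgraph,"seqfold/out","conv_1");
lgraph = connectLayers(lgraph,"conv_1","sequnfold/in");
lgraph = connectLayers(lgraph,"seqfold/miniBatchSize","sequnfold/miniBatchSize");
```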

