Can we use 'sequenceInputLayer(inputSize)' with 'featureInputLayer' in a multiple-input deep convolutional neural network?

17 views (last 30 days)
I am training a multiple-input CNN, where one input is a sequence input and the other is a feature input. I created the combined datastore as follows:
dsX1Train = arrayDatastore(XTrainD);
dsX2Train = arrayDatastore(XTrainf);
dsTTrain = arrayDatastore(XTrainL);
dsTrain = combine(dsX1Train,dsX2Train,dsTTrain);
Here, 'XTrainD' is an 800-by-1 cell array in which each cell holds a 1-by-1-by-800 (single) sequence, 'XTrainf' is an 800-by-1 (single) feature array, and 'XTrainL' is an 800-by-1 categorical array of labels. When training with trainnet() using the following options,
options = trainingOptions('adam',...
'Shuffle','every-epoch',...
'InputDataFormats',{'CBT','BC'},...
'MaxEpochs',50,...
'MiniBatchSize',16,...
'InitialLearnRate',1e-4,...
'Verbose',1,...
'ExecutionEnvironment','cpu',...
'Plots','training-progress');
net = trainnet(dsTrain,layer,"crossentropy",options);
the following error is thrown:
Error using trainnet (line 46)
Error forming mini-batch for network input "input_1". Data interpreted with format "CBT". To specify a different format, use the InputDataFormats option.
Caused by:
Input sequences must be numeric or categorical arrays.
Am I creating the data and the datastore in the right way? Is it possible to train a multiple-input network with trainnet when one of the inputs is a sequence input layer? I used the Train Network on Image and Feature Data - MATLAB & Simulink - MathWorks example as a reference.
Thanks in advance for your help.

Accepted Answer

Manikanta Aditya, 2024-7-1
Edited: Manikanta Aditya, 2024-7-4
Hi,
The error you’re encountering relates to the format of your input data: trainnet expects every sequence read from the datastore to be a numeric or categorical array. It’s therefore important to make sure that your sequence data 'XTrainD' and feature data 'XTrainf' are stored in that form; a quick check such as the sketch below can help.
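A minimal check, assuming 'XTrainD' is the 800-by-1 cell array of 1-by-1-by-800 single sequences described in the question (the variable names are taken from the question):
% Every cell of XTrainD should contain a numeric (or categorical) array,
% e.g. a 1-by-1-by-800 single, not a nested cell, table, or string.
isSeqValid = cellfun(@(x) isnumeric(x) || iscategorical(x), XTrainD);
assert(all(isSeqValid), 'Some cells of XTrainD are not numeric or categorical arrays.')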
As a workaround, you can use 'sequenceInputLayer' and 'featureInputLayer' together in a multiple-input deep convolutional neural network. Note that creating the arrayDatastore objects with 'OutputType','same' makes each read return the stored array itself rather than an extra layer of cell wrapping, which can otherwise trigger exactly the "Input sequences must be numeric or categorical arrays" error.
Here is an example script that defines the 'sequenceInputLayer' and 'featureInputLayer' branches and combines them:
dsX1Train = arrayDatastore(XTrainD, 'OutputType', 'same'); % For sequence data
dsX2Train = arrayDatastore(XTrainf, 'OutputType', 'same'); % For feature data
dsTTrain = arrayDatastore(XTrainL, 'OutputType', 'same'); % For labels
dsTrain = combine(dsX1Train, dsX2Train, dsTTrain);
numClasses = numel(categories(XTrainL));

% Sequence branch (input format 'CBT': one channel, 800 time steps)
seqLayers = [
    sequenceInputLayer(1, 'Name', 'sequence_input')
    % Example sequence-processing layer; 'OutputMode','last' collapses the
    % time dimension so the branch output matches the feature branch format
    lstmLayer(128, 'OutputMode', 'last', 'Name', 'lstm_seq')
    fullyConnectedLayer(128, 'Name', 'fc_seq')];

% Feature branch (input format 'BC': one feature per observation)
featLayers = [
    featureInputLayer(1, 'Name', 'feature_input')
    fullyConnectedLayer(128, 'Name', 'fc_feat')];

% Combined classification head (no classificationLayer: trainnet takes the
% loss, e.g. "crossentropy", as a separate argument)
headLayers = [
    concatenationLayer(1, 2, 'Name', 'concat')
    fullyConnectedLayer(64, 'Name', 'fc1')
    reluLayer('Name', 'relu1')
    fullyConnectedLayer(numClasses, 'Name', 'fc2')
    softmaxLayer('Name', 'softmax')];

lgraph = layerGraph(seqLayers);
lgraph = addLayers(lgraph, featLayers);
lgraph = addLayers(lgraph, headLayers);
lgraph = connectLayers(lgraph, 'fc_seq', 'concat/in1');
lgraph = connectLayers(lgraph, 'fc_feat', 'concat/in2');
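With this layout, the LSTM collapses each sequence to a fixed-length vector, so the output of 'fc_seq' has the same 'CB' layout as 'fc_feat' and the two branches can be concatenated along the channel dimension.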
  • Ensure the sequence data format is CBT and the feature data format is BC.
  • Create and combine arrayDatastore objects for sequences, features, and labels.
  • Define the network architecture with appropriate input layers and combine them.
  • Use trainingOptions with InputDataFormats set to {'CBT', 'BC'}.
  • Convert the layer graph to a dlnetwork and train it with trainnet (see the sketch below).
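A minimal training sketch, assuming the layer graph defined above and the combined datastore dsTrain from the question:
% trainnet computes the loss itself, so the network ends at the softmax layer
net = dlnetwork(lgraph);

options = trainingOptions('adam', ...
    'Shuffle', 'every-epoch', ...
    'InputDataFormats', {'CBT','BC'}, ... % sequence input first, then feature input
    'MaxEpochs', 50, ...
    'MiniBatchSize', 16, ...
    'InitialLearnRate', 1e-4, ...
    'Plots', 'training-progress');

net = trainnet(dsTrain, net, "crossentropy", options);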
Refer to the documentation of sequenceInputLayer, featureInputLayer, dlnetwork, and trainnet to know more.
I hope this helps!

More Answers (0)
