The error message is as follows:
Error using trainnet (line 46)
Error forming mini-batch of targets for network output "fc_1". Data interpreted with format "BC".
To specify a different format, use the TargetDataFormats option.
Error in main (line 52)
net = trainnet(dsX, net, lossFcn, options);
Caused by:
Index exceeds the number of array elements. Index must not exceed 1.
If you want to use a neural network with multiple inputs or outputs, you can use combine to create a CombinedDatastore. This resolves the mini-batch error, as shown in the code below.
clc; clear;
% load("paddedData2.mat","-mat")
paddedData = cell(10, 3); % Number of observations intentionally reduced from 900 to 10 for reproducibility
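% Dummy data: columns 1 and 2 hold the two scalar targets, column 3 holds the 440-by-5 predictor for each observation.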
paddedData(:,1) = cellfun(@(x) 405, paddedData(:,1), 'UniformOutput', false);
paddedData(:,2) = cellfun(@(x) 10, paddedData(:,2), 'UniformOutput', false);
paddedData(:,3) = cellfun(@(x) rand(440, 5), paddedData(:,3), 'UniformOutput', false);
XTrain = paddedData(:,3);
YTrain1 = cell2mat(paddedData(:,1));
YTrain2 = cell2mat(paddedData(:,2));
dsX = arrayDatastore(XTrain, 'OutputType', 'same');
dsY1 = arrayDatastore(YTrain1, 'OutputType', 'same');
dsY2 = arrayDatastore(YTrain2, 'OutputType', 'same');
dsTrain = combine(dsX, dsY1, dsY2);
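% Optional sanity check: each read from the combined datastore should return
% one observation as a row with the predictor followed by the two targets, in that order.
preview(dsTrain)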
net = dlnetwork;
tempNet = [
    sequenceInputLayer([440 5 1],"Name","sequenceinput")
    convolution2dLayer([3 3],8,"Name","conv_A1")
    batchNormalizationLayer("Name","batchnorm_A1")
    reluLayer("Name","relu_A1")
    convolution2dLayer([3 3],8,"Name","conv_2")
    batchNormalizationLayer("Name","batchnorm_2")
    reluLayer("Name","relu_2")
    flattenLayer("Name","flatten")
    fullyConnectedLayer(100,"Name","fc")
    lstmLayer(100,"Name","lstm","OutputMode","last")];
net = addLayers(net,tempNet);
tempNet = fullyConnectedLayer(1,"Name","fc_1");
net = addLayers(net,tempNet);
tempNet = fullyConnectedLayer(1,"Name","fc_2");
net = addLayers(net,tempNet);
clear tempNet;
net = connectLayers(net,"lstm","fc_1");
net = connectLayers(net,"lstm","fc_2");
net = initialize(net);
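% Optional check: the unconnected layers "fc_1" and "fc_2" become the network
% outputs; their order in net.OutputNames matters for the custom loss below.
disp(net.OutputNames)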
options = trainingOptions('adam', ...
    'MaxEpochs', 50, ...
    'MiniBatchSize', 100, ...
    'Shuffle', 'every-epoch', ...
    'Plots', 'none');
lossFcn = @(Y1,Y2,T1,T2) crossentropy(Y1,T1) + 0.1*mse(Y2,T2);
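% trainnet calls lossFcn as lossFcn(prediction1, prediction2, target1, target2):
% predictions follow the order of net.OutputNames, and targets follow the order
% of the target datastores in dsTrain.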
net = trainnet(dsTrain, net, lossFcn, options);
You can refer to the following example page for a detailed explanation of multiple-input/multiple-output networks.
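After training, you can get both outputs in one call. Here is a minimal sketch, assuming a release that has minibatchpredict (R2024a or later) and reusing the predictor datastore dsX from above:
% One output is returned per network output, following the order of net.OutputNames.
[pred1, pred2] = minibatchpredict(net, dsX); % pred1 -> "fc_1", pred2 -> "fc_2"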