Error using trainNetwork and minibatchqueue on a multi-input (sequence and feature) network
A multi-input (sequence and feature) dataset is constructed below. The sequence input (XTrain1) is a 1347-by-1 cell array, where each cell is a 7-by-31 double; the feature input (XTrain2) is a 1347-by-80 double; the output is a 1347-by-1 categorical array. (Only the data structure matters, so the details are replaced with random numbers.)
% Clear variables and close figures
clear; clc; close all;
% Generate example data (simulating real data)
N = 1347; % Number of samples
T = 31; % Time steps
F_time = 7; % Number of time series features
F_static = 80; % Number of static features
% Time series input (1347 samples, each sample is 7*31)
XTrain1 = cell(N, 1);
for i = 1:N
    XTrain1{i} = rand(T, F_time)'; % Each sample is a 7*31 matrix
end
% Static feature input (1347 samples, each sample has 80 dimensions)
XTrain2 = rand(N, F_static); % 1347x80 matrix
% Classification labels (assuming binary classification)
YTrain = categorical(randi([1, 2], N, 1)); % Binary classification problem
% Convert XTrain1 (time series data) to arrayDatastore
ds1 = arrayDatastore(XTrain1, 'OutputType', 'cell'); % Time series data in cell format
% Convert XTrain2 (static feature data) to arrayDatastore
ds2 = arrayDatastore(XTrain2, 'OutputType', 'cell'); % Each read returns one 1-by-80 row wrapped in a cell
% Convert YTrain (labels) to arrayDatastore
dsY = arrayDatastore(YTrain, 'OutputType', 'cell'); % Labels in categorical format
% Create a combined datastore, combining multiple inputs into one multi-input data stream
trainDs = combine(ds1, ds2, dsY);
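For trainNetwork, each read of the combined datastore must return a 1-by-(numInputs + 1) cell array, with one column per network input and the responses in the last column. A quick sanity check on the datastore built above:

```matlab
% Inspect one observation from the combined datastore. For this network,
% trainNetwork expects a 1-by-3 cell: {sequence, features, response}.
sample = preview(trainDs);
size(sample)     % should be a 1-by-3 cell
class(sample{1}) % the 7-by-31 sequence observation
class(sample{3}) % the categorical response
```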
The network is constructed below:
%% Define the network
lgraph = layerGraph();
tempLayers = [
    sequenceInputLayer(7,"Name","input","Normalization","zerocenter")
    lstmLayer(128,"Name","lstm","OutputMode","last")
    dropoutLayer(0.5,"Name","dropout")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
    featureInputLayer(8,"Name","featureinput")
    fullyConnectedLayer(64,"Name","fc_1")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
    concatenationLayer(1,2,"Name","concat")
    fullyConnectedLayer(64,"Name","fc")
    fullyConnectedLayer(2,"Name","fc_2")
    softmaxLayer("Name","softmax")
    classificationLayer("Name","classification")];
lgraph = addLayers(lgraph,tempLayers);
lgraph = connectLayers(lgraph,"dropout","concat/in1");
lgraph = connectLayers(lgraph,"fc_1","concat/in2");
% Clear temporary variables
clear tempLayers;
% Display the network structure
plot(lgraph)
% Training options
options = trainingOptions("sgdm", ...
    MaxEpochs=15, ...
    InitialLearnRate=0.01, ...
    Plots="training-progress", ...
    Verbose=0);
% Train the network using the combined training dataset
net = trainNetwork(trainDs,lgraph,options);
It seems that my input format is wrong, so how can I fix this error? Thanks for viewing my question, and I would be glad to receive any guidance!
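One inconsistency between the data and the network is worth checking: XTrain2 has F_static = 80 columns, but the feature branch declares featureInputLayer(8). A feature branch whose input size matches the data above would look like this (a sketch of the size fix only, not a confirmed fix for the reported error):

```matlab
% Feature branch sized to match the static input: size(XTrain2,2) = 80.
tempLayers = [
    featureInputLayer(80,"Name","featureinput") % was featureInputLayer(8)
    fullyConnectedLayer(64,"Name","fc_1")];
```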
2 Comments
Saurabh
2024-10-30
While reviewing the documentation for 'minibatchqueue', I discovered the definition of 'MiniBatchFcn', which is a name-value argument of 'minibatchqueue'. It states that:
'Use a function handle to a custom function to preprocess mini-batches for custom training. Doing so is recommended for one-hot encoding classification labels, padding sequence data, calculating average images, and so on. You must specify a custom function if your data consists of cell arrays containing arrays of different sizes.'
From what I understand, there is a need to specify a custom function because the data consists of cell arrays of varying sizes.
The same information can be found here:
I hope this was helpful.
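Following that documentation, a custom MiniBatchFcn for this data layout might look like the sketch below. It assumes a dlnetwork-based custom training loop (minibatchqueue is designed for custom training, not for trainNetwork), and since every sequence here is exactly 31 steps, no padding is needed; with variable lengths you would pad to a common length first. Names and mini-batch size are illustrative.

```matlab
% Sketch: minibatchqueue over the combined datastore with a custom
% preprocessing function. Outputs are formatted "CBT" (sequences),
% "CB" (features), and unformatted one-hot labels.
mbq = minibatchqueue(trainDs, ...
    MiniBatchSize=64, ...
    MiniBatchFcn=@preprocessMiniBatch, ...
    MiniBatchFormat=["CBT" "CB" ""]);

function [X1, X2, T] = preprocessMiniBatch(seqCell, featCell, labelCell)
    % Sequences: B cells of 7-by-31 -> 7-by-B-by-31 (channel, batch, time).
    seqs = cellfun(@(s) permute(s, [1 3 2]), seqCell, UniformOutput=false);
    X1 = cat(2, seqs{:});
    % Features: B cells of 1-by-80 rows -> 80-by-B.
    X2 = cat(1, featCell{:}).';
    % Labels: B-by-1 categorical -> one-hot 2-by-B.
    T = onehotencode(cat(1, labelCell{:}), 2).';
end
```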
Answers (0)
