MATLAB Coder for Deep Learning

2 views (last 30 days)
Yang Liu 2023-11-21
Edited: Yang Liu 2024-3-14
The DL layers supported by MATLAB Coder are listed and continuously updated in the link below:
Many layers are supported by generic C/C++ nowadays.
My question is:
When will MATLAB Coder support the SequenceFolding and SequenceUnfolding layers for generic C/C++ code generation? Is there a plan for these two layers, or will they just be skipped?
2 Comments
Sergio Matiz Romero 2023-11-21
Edited: Sergio Matiz Romero 2023-11-21
Thank you for reaching out. Does your application require that you insert the folding and unfolding layers explicitly for a particular reason? Notice that you can completely avoid the insertion of these layers if you use dlnetwork instead of DAG networks. For instance, you can construct a convolution + LSTM network as:
layers = [
    sequenceInputLayer([8 8 3],'Name','seq')
    convolution2dLayer(3,3,'Name','convolution1','Padding','same')
    lstmLayer(20,'Name','lstm')
    fullyConnectedLayer(10,'Name','fc')
    ];
dlnet = dlnetwork(layers);
and the above network does support generic C/C++ code generation. On the other hand, when using DAG networks, you would need to insert sequence folding/unfolding layers (around convolution), which are not currently supported for generic C/C++ code generation.
Can the network you are working with be expressed as a dlnetwork to avoid the use of the unsupported layers? If so, I would recommend using a dlnetwork, since it will soon become the recommended workflow. Otherwise, please let me know more about your use case so that I can further assist you.
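[Editor's note] A minimal sketch of how generic C/C++ code generation from a dlnetwork might look, assuming a trained network saved as dlnetFile.mat and an entry-point function named myPredict (both names are hypothetical, as is the 'SSCBT' data format for the [8 8 3] sequence input):

```matlab
% Save the trained dlnetwork so the entry-point function can load it.
save('dlnetFile.mat','dlnet');

% Entry-point function, in its own file myPredict.m:
%   function out = myPredict(in)
%       persistent net
%       if isempty(net)
%           net = coder.loadDeepLearningNetwork('dlnetFile.mat');
%       end
%       out = predict(net, in);
%   end

% Configure generic C++ code generation (no third-party DL library).
cfg = coder.config('lib');
cfg.TargetLang = 'C++';
cfg.DeepLearningConfig = coder.DeepLearningConfig('TargetLibrary','none');

% Example input: one 8x8x3 frame, one batch, one time step.
exampleIn = dlarray(zeros(8,8,3,1,1,'single'),'SSCBT');
codegen -config cfg myPredict -args {exampleIn}
```

The 'TargetLibrary','none' option is what selects library-free generic C/C++, which is the mode the question asks about.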
Yang Liu 2023-11-22
Edited: Yang Liu 2024-3-14
Dear Sergio,
Thanks a lot for your detailed explanation and suggestion!
My network is actually FC (fully connected) + LSTM for classification. My layers are constructed as below (sorry, a bit lengthy): several FC layers connect with concatenated LSTMs.
Maybe I'm out of date on the officially recommended workflow (yes, I currently use a DAG network), but can dlnetwork also waive the sequence folding/unfolding layers when applied to an FC network?
Your code looks neat and concise!
layers = [ ...
    sequenceInputLayer(inputSize,'Name','Tgt_Seq_Input')
    sequenceFoldingLayer('Name','SeqFold')
    fullyConnectedLayer(32,'Name','FC_Out_32')
    batchNormalizationLayer('Name','BN_FC_Out_32')
    reluLayer('Name','Relu_FC_Out_32')
    fullyConnectedLayer(16,'Name','FC_Out_16')
    batchNormalizationLayer('Name','BN_FC_Out_16')
    reluLayer('Name','Relu_FC_Out_16')
    fullyConnectedLayer(8,'Name','FC_Out_8')
    batchNormalizationLayer('Name','BN_FC_Out_8')
    reluLayer('Name','Relu_FC_Out_8')
    fullyConnectedLayer(4,'Name','FC_Out_4')
    batchNormalizationLayer('Name','BN_FC_Out_4')
    reluLayer('Name','Relu_FC_Out_4')
    sequenceUnfoldingLayer('Name','SeqUnfold')
    lstmLayer(numHiddenUnits,'OutputMode','sequence','Name','lstm1')
    lstmLayer(numHiddenUnits,'OutputMode','sequence','Name','lstm2')
    lstmLayer(numHiddenUnits,'OutputMode','sequence','Name','lstm3')
    lstmLayer(numHiddenUnits,'OutputMode','last','Name','lstm4')
    fullyConnectedLayer(numClasses,'Name','FC_Out_2')
    softmaxLayer('Name','softmax')
    classificationLayer('Name','classification')
    ];
lgraph = layerGraph(layers);
lgraph_1 = connectLayers(lgraph,'SeqFold/miniBatchSize','SeqUnfold/miniBatchSize');
figure
plot(lgraph_1)
options = trainingOptions('sgdm', ...
'ExecutionEnvironment','cpu', ...
'GradientThreshold',1, ...
'InitialLearnRate',0.5,...
'LearnRateSchedule','piecewise', ...
'LearnRateDropFactor',0.5, ...
'LearnRateDropPeriod',10, ...
'Momentum',0.15,...
'MaxEpochs',maxEpochs, ...
'MiniBatchSize',miniBatchSize, ...
'Shuffle','every-epoch', ...
'ValidationData',{TS_Train_Comb_Cell_Validation,TS_Train_Comb_Cell_Validation_Label}, ...
'ValidationFrequency',300, ...
'Verbose',1, ...
'Plots','training-progress');
% net = trainNetwork(trainingDatastore,lgraph_1,options);
net = trainNetwork(TS_Train_Comb_Cell_Train,TS_Train_Comb_Cell_Train_Label,lgraph_1,options);
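[Editor's note] A sketch of the same FC + LSTM stack expressed as a dlnetwork, with the folding/unfolding layers simply removed, since dlnetwork applies the fully connected and normalization layers per time step on its own. The classificationLayer is also dropped because dlnetwork does not use output layers; training would then go through trainnet (R2023b and later) or a custom training loop with a cross-entropy loss. The placeholder values for inputSize, numHiddenUnits, and numClasses are assumptions; substitute your own:

```matlab
inputSize = 8; numHiddenUnits = 20; numClasses = 2;  % placeholder values

layers = [ ...
    sequenceInputLayer(inputSize,'Name','Tgt_Seq_Input')
    fullyConnectedLayer(32,'Name','FC_Out_32')
    batchNormalizationLayer('Name','BN_FC_Out_32')
    reluLayer('Name','Relu_FC_Out_32')
    fullyConnectedLayer(16,'Name','FC_Out_16')
    batchNormalizationLayer('Name','BN_FC_Out_16')
    reluLayer('Name','Relu_FC_Out_16')
    fullyConnectedLayer(8,'Name','FC_Out_8')
    batchNormalizationLayer('Name','BN_FC_Out_8')
    reluLayer('Name','Relu_FC_Out_8')
    fullyConnectedLayer(4,'Name','FC_Out_4')
    batchNormalizationLayer('Name','BN_FC_Out_4')
    reluLayer('Name','Relu_FC_Out_4')
    lstmLayer(numHiddenUnits,'OutputMode','sequence','Name','lstm1')
    lstmLayer(numHiddenUnits,'OutputMode','sequence','Name','lstm2')
    lstmLayer(numHiddenUnits,'OutputMode','sequence','Name','lstm3')
    lstmLayer(numHiddenUnits,'OutputMode','last','Name','lstm4')
    fullyConnectedLayer(numClasses,'Name','FC_Out_2')
    softmaxLayer('Name','softmax')   % no classificationLayer in a dlnetwork
    ];

dlnet = dlnetwork(layers);
analyzeNetwork(dlnet)   % inspect the network; no SeqFold/SeqUnfold connections needed
```

Whether batchNormalizationLayer behaves identically on per-time-step data in this form is worth verifying against your DAG-network results before committing to the change.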


Answers (0)

Release

R2022b
