How to use different train function for LSTM model?

15 views (last 30 days)
When using a basic (shallow) neural network, there is a 'trainFcn' parameter that can be set to 'trainlm'. But I can't find an equivalent parameter to set for an LSTM model.

Answers (2)

Ashu
Ashu on 7 Oct 2022
Edited: Ashu on 7 Oct 2022
I understand that you want to use 'trainlm' as the backpropagation (training) algorithm for an LSTM model.
Case 1: Multilayer Shallow Neural Networks
[x,t] = simplefit_dataset;
net = fitnet(10,'trainbr'); % here trainFcn is set to 'trainbr'
view(net); % view the untrained network
net = train(net,x,t);
view(net); % view the trained network
Note that when you create a network this way, "trainFcn" can be set only to the algorithms listed here: https://www.mathworks.com/help/deeplearning/ref/fitnet.html
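As a side note (a sketch using the shallow-network API): the training function can also be changed on an existing network object through its trainFcn property, rather than at creation time:

```matlab
% Sketch: change the training function of an existing shallow network.
[x,t] = simplefit_dataset;
net = fitnet(10);           % default trainFcn for fitnet is 'trainlm'
net.trainFcn = 'trainbr';   % switch to Bayesian regularization
net = train(net,x,t);
y = net(x);                 % evaluate the trained network
```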
Case 2: To train a deep learning network, use trainNetwork.
When you create an LSTM, it is a recurrent neural network.
Example :
Define Network Architecture
Define the network architecture. Create an LSTM network that consists of an LSTM layer with 200 hidden units, followed by a fully connected layer of size 50 and a dropout layer with dropout probability 0.5.
numFeatures = size(XTrain{1},1);
numResponses = size(YTrain{1},1);
numHiddenUnits = 200;
layers = [ ...
sequenceInputLayer(numFeatures)
lstmLayer(numHiddenUnits,'OutputMode','sequence')
fullyConnectedLayer(50)
dropoutLayer(0.5)
fullyConnectedLayer(numResponses)
regressionLayer];
maxEpochs = 60;
miniBatchSize = 20;
options = trainingOptions('adam', ...
'MaxEpochs',maxEpochs, ...
'MiniBatchSize',miniBatchSize, ...
'InitialLearnRate',0.01, ...
'GradientThreshold',1, ...
'Shuffle','never', ...
'Plots','training-progress',...
'Verbose',0);
Train the Network
Train the network using trainNetwork.
net = trainNetwork(XTrain,YTrain,layers,options);
In "trainingOptions" you cannot set "solverName" to "trainlm".
Solver for training network, specified as one of the following:
  • 'sgdm' — Use the stochastic gradient descent with momentum (SGDM) optimizer. You can specify the momentum value using the Momentum training option.
  • 'rmsprop' — Use the RMSProp optimizer. You can specify the decay rate of the squared gradient moving average using the SquaredGradientDecayFactor training option.
  • 'adam' — Use the Adam optimizer. You can specify the decay rates of the gradient and squared gradient moving averages using the GradientDecayFactor and SquaredGradientDecayFactor training options, respectively.
For more information about the different solvers, see Stochastic Gradient Descent.
If the trainingOptions function does not provide the training options that you need for your task, then you can create a custom training loop using automatic differentiation. To learn more, see Define Deep Learning Network for Custom Training Loops.
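Such a custom training loop has roughly this shape (a minimal sketch: the batching helper nextBatch and the loss function modelLoss are illustrative names, not part of the original answer):

```matlab
% Minimal custom training loop sketch (adapt data handling to your task).
net = dlnetwork(layersWithoutOutputLayer);  % custom loops use dlnetwork
averageGrad = [];
averageSqGrad = [];
numIterations = 100;

for iteration = 1:numIterations
    [X,T] = nextBatch();  % hypothetical helper returning dlarray batches
    % Evaluate loss and gradients inside dlfeval so dlgradient can trace.
    [loss,gradients] = dlfeval(@modelLoss,net,X,T);
    % Apply one Adam update step to the learnable parameters.
    [net,averageGrad,averageSqGrad] = adamupdate(net,gradients, ...
        averageGrad,averageSqGrad,iteration);
end

function [loss,gradients] = modelLoss(net,X,T)
    Y = forward(net,X);                      % forward pass
    loss = mse(Y,T);                         % regression loss
    gradients = dlgradient(loss,net.Learnables);
end
```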

Krishna
Krishna on 4 Nov 2024 at 16:40
Hi Hongwei,
Support for the Levenberg-Marquardt solver was introduced for 'dlnetwork' objects in R2024b.
To use it, set solverName to 'lm' in trainingOptions and then train the network with 'trainnet'.
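A minimal sketch of that workflow (assumes R2024b or later; XTrain, TTrain, and layers stand in for your own data and architecture):

```matlab
% Sketch: train with the Levenberg-Marquardt ("lm") solver via trainnet.
options = trainingOptions("lm", ...
    "MaxEpochs",60, ...
    "Plots","training-progress");
net = trainnet(XTrain,TTrain,layers,"mse",options);
```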
Please go through the following documentation to learn more.
Also, please go through the documentation on how to post properly in MATLAB Answers so that you get a quick reply.
Hope this helps.
