Integrating an LSTM layer into a NARX network

19 views (last 30 days)
Hi, is it possible to integrate an LSTM layer into this type of network, obtaining an architecture like:
Input Layer - NARX - LSTM - Output Layer?
Thanks to anyone who can help.
I attach my current code, into which I would like to insert the LSTM layer:
_______________________________________________________________________________________
% Solve an Autoregression Problem with External Input with a NARX Neural Network
%
% This script assumes these files are available:
%
%   NN-IN.xlsx  - input time series.
%   NN-TARG.csv - feedback (target) time series.
clear; clc; format long;
IN = readmatrix('NN-IN.xlsx');
TARG = readmatrix('NN-TARG.csv');
X = tonndata(IN,false,false);
T = tonndata(TARG,false,false);
% Choose a Training Function
% 'trainscg' uses less memory. Suitable in low memory situations.
% 'traingdx' Gradient descent with momentum and adaptive learning rate backpropagation
trainFcn = 'trainscg'; % Scaled conjugate gradient backpropagation.
% Create a Nonlinear Autoregressive Network with External Input
inputDelays = 1:2;
feedbackDelays = 1:2;
hiddenLayerSize = [30,10];
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize,'open',trainFcn);
% Prepare the Data for Training and Simulation
[x,xi,ai,t] = preparets(net,X,{},T);
% Setup Division of Data for Training, Validation, Testing
net.divideFcn = 'divideblock';
net.divideMode = 'time';
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Early stopping: halt after 3 consecutive validation failures
net.trainParam.max_fail = 3;
tic
% Train the Network
[net,tr] = train(net,x,t,xi,ai);
toc
% Test the Network
y = net(x,xi,ai);
e = gsubtract(t,y);
% net.performFcn = 'mse';
% net.performParam.normalization = 'standard';
performance = perform(net,t,y);
% View the Network
view(net);
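As a side note on the script above: after open-loop training, the shallow framework's closeloop utility can convert the network for multi-step prediction, where the network's own output replaces the target feedback. A minimal sketch, assuming net, X, and T from the script above (this does not add an LSTM, it only closes the feedback loop):

% Closed-loop conversion sketch (uses net, X, T defined above).
netc = closeloop(net);                      % feed predictions back as feedback
[xc, xic, aic, tc] = preparets(netc, X, {}, T);
yc = netc(xc, xic, aic);                    % multi-step closed-loop prediction
closedLoopPerformance = perform(netc, tc, yc)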
4 Comments

sam l on 28 Sep 2022
I am looking to use NARX and LSTM but have yet to figure it out. I was looking at CNN+RNN and wondered whether the same can be done here (see the sketch after this comment).
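For reference, a CNN+RNN-style stack can be written directly as a layer array in Deep Learning Toolbox; a minimal sketch for sequence-to-one regression, with illustrative layer sizes not taken from this thread:

% Hypothetical CNN+RNN stack (sizes illustrative); trainable with
% trainNetwork and trainingOptions.
layers = [ sequenceInputLayer(1)
           convolution1dLayer(4, 8, Padding="causal")  % CNN feature extractor
           reluLayer()
           lstmLayer(16, OutputMode="last")            % RNN summarizer
           fullyConnectedLayer(1)
           regressionLayer() ];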


Answers (1)

Conor Daly on 31 Mar 2023
Hi Girolamo,
You can combine NARX and LSTM architectures within a dlnetwork. Note that a NARX network is essentially a 1D convolution over a concatenation of the input sequence x and the delayed target sequence t. Here's an example which you can run using the L-BFGS optimizer (lbfgsupdate), released in R2023a:
%% Get MagLev data.
[x,t] = maglev_dataset;
x = [x{:}];
t = [t{:}];
% Inputs span steps 1..N-1; targets are shifted by 2 so they align with
% the output of the width-2 convolution below (two delay taps).
X1 = dlarray(x(:, 1:end-1), 'CTB');   % external input series
X2 = dlarray(t(:, 1:end-1), 'CTB');   % feedback (target history) series
T = dlarray(t(:, 3:end), 'CTB');      % prediction targets
%% Construct NARX-LSTM dlnetwork.
layers = [ sequenceInputLayer(1, Name="xin", MinLength=2)
           concatenationLayer(1, 2, Name="cat")
           convolution1dLayer(2, 10)   % width-2 filter plays the role of the tapped delays
           tanhLayer()
           lstmLayer(16)
           fullyConnectedLayer(1) ];
lg = layerGraph(layers);
lg = addLayers(lg, sequenceInputLayer(1, Name="tin", MinLength=2));
lg = connectLayers(lg, "tin", "cat/in2");
net = dlnetwork(lg);
analyzeNetwork(net, X1, X2)
%% Train using L-BFGS.
maxEpochs = 150;
solverState = [];
lossFcn = @(net) dlfeval(@modelLoss, net, X1, X2, T);
monitor = trainingProgressMonitor;
monitor.Metrics = "TrainingLoss";
monitor.XLabel = "Epoch";
for epoch = 1:maxEpochs
    [net, solverState] = lbfgsupdate(net, lossFcn, solverState);
    recordMetrics(monitor, epoch, TrainingLoss=solverState.Loss);
end
%% Open-loop inference.
Y = predict(net, X1, X2);
yopen = extractdata(Y(:));
figure;
plot(yopen, '.'), hold on, plot(t(3:end))
%% Model loss function.
function [loss, grad] = modelLoss(net, X1, X2, T)
    Y = predict(net, X1, X2);
    loss = l2loss(Y, T);
    grad = dlgradient(loss, net.Learnables);
end
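A possible follow-up, not from the original answer: once trained, the network can be rolled out closed-loop by feeding each prediction back in as the next feedback value. A minimal, unoptimized sketch that re-runs the growing sequence each step so the LSTM sees its full history (numWarm, numSteps, and yclosed are illustrative names; if pasted into the same script, place this before the local function):

%% Closed-loop rollout (illustrative sketch).
numWarm = 50;                           % steps of true feedback to seed with
numSteps = 200;                         % steps to predict closed-loop
tfb = t(:, 1:numWarm);                  % feedback buffer, grows each step
for k = 1:numSteps
    X1k = dlarray(x(:, 1:numWarm+k-1), 'CTB');
    X2k = dlarray(tfb, 'CTB');
    Yk = extractdata(predict(net, X1k, X2k));
    tfb(:, end+1) = Yk(end);            % last time step is the new prediction
end
yclosed = tfb(:, numWarm+1:end);
figure;
plot(yclosed), hold on, plot(t(numWarm+1:numWarm+numSteps))
legend("Closed-loop prediction", "Target")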
