Size of the input layer is different from the expected input size
Hi everyone,
I want to combine a feedforward net with 3 features (3x1) with an RNN with 2 time-varying features (each having 252 observations). Say I want to concatenate both networks into a single feedforward layer. No matter how I specify the dimension in the concatenation layer (4, 3, 2, or 1), I always get the error message "Size of the input layer is different from the expected input size" in the app designer. I also tried adding another feedforward layer after the GRU layer, but nothing worked. The network structure I have set up looks as follows:
%Create Layer Graph
lgraph = layerGraph();
%Add Layer Branches
tempLayers = [
    sequenceInputLayer(2,"Name","sequence")
    gruLayer(200,"Name","gru_1")
    gruLayer(200,"Name","gru_2")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
    featureInputLayer(3,"Name","featureinput")
    fullyConnectedLayer(128,"Name","fc_1")
    fullyConnectedLayer(200,"Name","fc_4")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
    concatenationLayer(4,2,"Name","concat")
    fullyConnectedLayer(200,"Name","fc_2")
    fullyConnectedLayer(10,"Name","fc_3")];
lgraph = addLayers(lgraph,tempLayers);
% clean up helper variable
clear tempLayers;
%Connect Layer Branches
%Connect all the branches of the network to create the network graph.
lgraph = connectLayers(lgraph,"gru_2","concat/in2");
lgraph = connectLayers(lgraph,"fc_4","concat/in1");
%Plot Layers
plot(lgraph);
Any comment or feedback is highly appreciated.
Accepted Answer
Ben
2022-6-20
You're trying to concatenate the output of "gru_2" with the output of "fc_4". However, "gru_2" outputs sequence data and "fc_4" doesn't. There are probably 2 things to try, depending on your task:
1. If your target data is not sequences, you can set the OutputMode of gruLayer to "last" so that it outputs only the last hidden state. This can then be concatenated with the output of "fc_4" along dimension 1.
2. If your target data are sequences, you could concatenate the output of "fc_4" to each sequence element of the output of "gru_2".
An example of 1:
layers = [sequenceInputLayer(2)
    gruLayer(200,OutputMode="last")
    concatenationLayer(1,2,"Name","cat")
    fullyConnectedLayer(10,"Name","output")];
lgraph = layerGraph(layers);
lgraph = lgraph.addLayers([featureInputLayer(3); fullyConnectedLayer(200,"Name","fc")]);
lgraph = lgraph.connectLayers("fc","cat/in2");
% This is fine for dlnetwork. For trainNetwork you will need an output layer:
analyzeNetwork(lgraph,"TargetUsage","dlnetwork");
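Beyond analyzeNetwork, a quick way to sanity check option 1 (not part of the original answer, just a sketch) is to assemble the dlnetwork and push random formatted inputs through it; inputs are passed in net.InputNames order, and the sizes below match the question's 2-channel, 252-step sequences with 3 static features:

```matlab
net = dlnetwork(lgraph);
X1 = dlarray(rand(2,5,252),"CBT");  % sequence input: 2 channels, batch of 5, 252 time steps
X2 = dlarray(rand(3,5),"CB");       % feature input: 3 features, batch of 5
Y = predict(net,X1,X2);             % inputs follow net.InputNames order (sequence input first here)
size(Y)                             % expect a 10-by-5 "CB" array: 10 outputs per observation
```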
For 2 it's a little harder: you need a custom layer to repmat the "fc_4" output over the sequence dimension. The shortest way to do this is probably with functionLayer, as follows:
concatLayer = functionLayer(@concatToSequence,"Formattable",true,"Name","cat");
layers = [sequenceInputLayer(2)
    gruLayer(200)
    concatLayer
    fullyConnectedLayer(10,"Name","Output")];
lgraph = layerGraph(layers);
lgraph = lgraph.addLayers([featureInputLayer(3); fullyConnectedLayer(200,"Name","fc")]);
lgraph = lgraph.connectLayers("fc","cat/in2");
analyzeNetwork(lgraph,"TargetUsage","dlnetwork");
function z = concatToSequence(x,y)
    % Assume x is a sequence and y is not:
    % x is "CBT" (channel-by-batch-by-time) and y is "CB".
    y = repmat(y,[1,1,size(x,3)]);
    % Apply labels to y - i.e. add the "T" label.
    y = dlarray(y,"CBT");
    z = cat(1,x,y);
end
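As a quick sanity check (not from the original answer), the helper can be called directly on formatted dlarrays shaped like the outputs of "gru_2" and "fc_4"; this mirrors what the Formattable functionLayer sees at training time:

```matlab
x = dlarray(rand(200,1,252),"CBT");  % like the gru output: 200 channels, batch 1, 252 steps
y = dlarray(rand(200,1),"CB");       % like the fc output: no time dimension
z = concatToSequence(x,y);
size(z)  % 400 channels, batch 1, 252 steps: y is repeated along T, then concatenated along C
```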