Deep Learning Toolbox - fullyConnectedLayer output dimension

Problem
I am new to the Deep Learning Toolbox and I am trying to create a custom layer.
This layer is supposed to come after a fully connected layer, but I seem to misunderstand the expected output dimensions of such a layer.
As I understand it, if I create a fullyConnectedLayer with an output size of, say, 3:
fullyConnectedLayer(3)
the output should be a vector of size [3 1] (or, more accurately, a cell array with this vector in its first position).
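To make my expectation concrete, here is a small standalone sketch (with made-up weights, not the actual layer parameters) of what I think a fullyConnectedLayer(3) computes for a single 12-element observation:
% Made-up weights and bias, only to illustrate the expected output shape
x = rand(12, 1);    % one 12-element observation, as a column vector
W = rand(3, 12);    % weights of a fullyConnectedLayer(3) fed with 12 inputs
b = rand(3, 1);     % bias
z = W*x + b;        % the fully connected operation
disp(size(z))       % prints: 3   1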
Example
To clarify my problem, let's consider a simple network. My training data is trainX (of size [1000 12], i.e. 1000 observations of 12-element vectors) and the training responses are trainS (of size [1000 6], 1000 observations of 6-element vectors). The test data is dataX and the test responses are dataS (same dimensions as the training data).
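Since I cannot attach data.mat here, random placeholder arrays of the same sizes can stand in for it (the values are meaningless; only the shapes matter for reproducing the behavior):
% Placeholder for data.mat: random data with the sizes described above
trainX = rand(1000, 12);   % 1000 observations, 12 features each
trainS = rand(1000, 6);    % 1000 corresponding 6-element responses
dataX  = rand(1000, 12);   % test data (same shape as training data)
dataS  = rand(1000, 6);
save('data.mat', 'trainX', 'trainS', 'dataX', 'dataS');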
The network is defined as follows:
%% Prepare Data
VALIDATION_PERCENT = 0.1;
load('data.mat');
% Hold out the first 10% of the observations for validation
validationX = trainX(1:floor(VALIDATION_PERCENT * length(trainX)), :);
validationS = trainS(1:floor(VALIDATION_PERCENT * length(trainS)), :);
trainX = trainX(floor(VALIDATION_PERCENT * length(trainX))+1:end, :);
trainS = trainS(floor(VALIDATION_PERCENT * length(trainS))+1:end, :);

%% Define Network
layers = [ ...
    sequenceInputLayer(12)
    fullyConnectedLayer(3)
    myLayer()
    regressionLayer
    ];
options = trainingOptions('sgdm', ...
    'ValidationData', {num2cell(validationX', 1), validationS});

%% Train Network
% Each observation is passed as a cell holding a 12x1 column vector
[trainedNet, traininfo] = trainNetwork(num2cell(trainX', 1), trainS, layers, options);
where myLayer is a custom layer that does nothing but print the dimensions of its input:
classdef myLayer < nnet.layer.Layer
    methods
        function varargout = predict(~, varargin)
            % Print the size of the incoming data and pass it through unchanged
            disp(size(varargin{:}))
            varargout = varargin;
        end
        function dLdX = backward(~, ~, ~, dLdZ, ~)
            % Identity layer: pass the incoming gradient straight through
            dLdX = dLdZ;
        end
    end
end
When I run the script above, the Command Window shows:
3 3
3 1
3 1
3 3
3 5
And right after that I get the following error (mainNetwork is the name of the neural network script):
Error using trainNetwork (line 150)
Invalid training data. If all recurrent layers have output mode 'sequence', then regression
responses must be a cell array of numeric sequences, or a single numeric sequence.
Error in mainNetwork (line 22)
[trainedNet, traininfo] = trainNetwork(num2cell(trainX', 1), trainS, layers, options);
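For what it's worth, the cell array I pass as training input seems to have the format I intend (this is a separate sanity check, run after the 10% validation split has removed 100 observations):
Xcell = num2cell(trainX', 1);
disp(size(Xcell))       % 1   900   (one cell per observation)
disp(size(Xcell{1}))    % 12  1     (each cell holds a 12x1 column vector)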
From this, three questions arise:
  1. Why is the output of the fullyConnectedLayer not exclusively of size [3 1]?
  2. Why does the output of the fullyConnectedLayer change its size on each iteration?
  3. What does the error mean, and why does it occur?
Thank you for reading and for your answer!

Answers (0)

Release: R2018b
