How to concatenate features from one fullyConnectedLayer in a DNN, with inputs being images from one class and features from a second class, for classifier training?
%temp2.m
imageInputSize = [28,28,1];
filterSize = 3;
numFilters = 8;
numClasses = 10;
numFeatures = 50;
layers = [
imageInputLayer(imageInputSize,'Normalization','none','Name','images')
convolution2dLayer(filterSize,numFilters,'Name','conv')
reluLayer('Name','relu')
fullyConnectedLayer(50,'Name','fc1')%1 x 1 x 50 x N
squeezeLayer()%50 x N
concatenationLayer(2,2,'Name','cat')
fullyConnectedLayer(numClasses,'Name','fc2')
softmaxLayer('Name','softmax')
classificationLayer];
lgraph = layerGraph(layers);
featInput = featureInputLayer(numFeatures,Name="features");%50 x N
lgraph = addLayers(lgraph,featInput);
lgraph = connectLayers(lgraph,"features","cat/in2");
numObservations = 100;
fakeImages = randn([imageInputSize,numObservations]);%28 28 1 100
imagesDS = arrayDatastore(fakeImages,IterationDimension=4);
fakeFeatures = randn([numFeatures,numObservations]);%50 x 100
featureDS = arrayDatastore(fakeFeatures,IterationDimension=2);%50x100
fakeTargets = categorical(mod(1:numObservations,numClasses));%1 x 100
targetDS = arrayDatastore(fakeTargets,IterationDimension=2);
ds = combine(imagesDS,featureDS,targetDS);
opts = trainingOptions("adam","MaxEpochs",1,"MiniBatchSize",128);
net=trainNetwork(ds,lgraph,opts);
function layer = squeezeLayer(args)
arguments
args.Name='';
end
layer = functionLayer(@squeezeLayerFcn,"Name",args.Name,"Formattable",true);
end
function x = squeezeLayerFcn(x)
x = squeeze(x);
% Since squeeze will squeeze out some dimensions, we need to relabel x.
% Assumption: x does not have a 'T' dimension.
n = ndims(x);
newdims = [repelem('S',n-2),'CB'];
x = dlarray(x,newdims);
%dims(x)
end
Error in temp2 (line 35)
net=trainNetwork(ds,lgraph,opts);
Caused by:
Layer 'cat': Input size mismatch. Size of input to this layer is different from the expected input size.
Inputs to this layer:
from layer 'layer' (size 50(C) × 1(B))
from layer 'features' (size 50(C) × 1(B))
Answers (1)
Ranjeet
2023-4-14
Hi Ming,
Assuming that you want to concatenate 'features' with the output of the squeeze layer, changing the first argument of concatenationLayer(2,2,'Name','cat') to concatenationLayer(1,2,'Name','cat') should resolve the input size mismatch. Both inputs arrive in 'CB' format with size 50(C) × 1(B), so they must be concatenated along dimension 1, the channel dimension; dimension 2 is the batch dimension, which must stay aligned across inputs.
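As a minimal sketch, only the concatenation line in your layers array needs to change (everything else in your script stays as posted):

```matlab
% Concatenate along dimension 1 (the 'C' channel dimension of the 'CB'
% formatted inputs). The 50-channel squeeze output and the 50-channel
% feature input then stack into a single 100 x N channel dimension,
% which 'fc2' can consume.
concatenationLayer(1,2,'Name','cat')
```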
Moreover, you can use analyzeNetwork, which analyzes the network, plots it, and reports any errors before training. In your case, call analyzeNetwork(lgraph).