Is selfAttentionLayer in MATLAB R2023a only valid for one-dimensional data?
When I try to combine selfAttentionLayer with ResNet-101, I get the error 'The input data contains at most one spatial dimension'. It doesn't work even if I put the layer between 'pool5' and 'fc', which makes the input to selfAttentionLayer 1-by-1 spatially. According to the example on this page, it seems that selfAttentionLayer is only valid for one-dimensional data. What should I do if I want to use it in an image classification task?
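For reference, even a much smaller layout than ResNet-101 hits the same error, because the activations reaching selfAttentionLayer still have two spatial dimensions. This is just an illustrative sketch, not my actual network:
% hypothetical minimal example: the feature map entering selfAttentionLayer
% is still spatial (SSCB), which the layer does not accept
layers = [
    imageInputLayer([224 224 3])
    convolution2dLayer(3, 16, 'Padding', 'same')
    reluLayer
    selfAttentionLayer(4, 32)   % input here still has two spatial dimensions
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];
analyzeNetwork(layers)          % the analyzer should flag the self-attention input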
THANK YOU so much for your reply! I am new to deep learning and I really need your help!
1 Comment
bin
2023-5-13
I'm also wondering about this. The current version doesn't seem to provide layers for converting between images and sequences.
Answers (1)
Gayathri
2025-6-13
Yes, as you identified, "selfAttentionLayer" operates on one-dimensional data. So when you want to use "selfAttentionLayer" in a classification task, you can place a "flattenLayer" after the last "maxPooling2dLayer" to collapse the spatial dimensions and bring the data into the format the attention layer expects. Please refer to the code below for a better understanding. This example uses the "DigitDataset" shipped with MATLAB.
% load digit dataset
digitDatasetPath = fullfile(matlabroot, 'toolbox', 'nnet', 'nndemos', 'nndatasets', 'DigitDataset');
imds = imageDatastore(digitDatasetPath, ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');
[imdsTrain, imdsValidation] = splitEachLabel(imds, 0.7, 'randomized');
% define network architecture
layers = [
    imageInputLayer([28 28 1], 'Name', 'input')
    convolution2dLayer(3, 32, 'Padding', 'same', 'Name', 'conv1')
    batchNormalizationLayer('Name', 'bn1')
    reluLayer('Name', 'relu1')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool1')
    convolution2dLayer(3, 64, 'Padding', 'same', 'Name', 'conv2')
    batchNormalizationLayer('Name', 'bn2')
    reluLayer('Name', 'relu2')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool2')
    flattenLayer('Name', 'flatten')
    selfAttentionLayer(8, 64, 'Name', 'self_attention')
    fullyConnectedLayer(10, 'Name', 'fc')
    softmaxLayer('Name', 'softmax')
    classificationLayer('Name', 'output')];
% set training options
options = trainingOptions('sgdm', ...
    'InitialLearnRate', 0.01, ...
    'MaxEpochs', 5, ...
    'Shuffle', 'every-epoch', ...
    'ValidationData', imdsValidation, ...
    'ValidationFrequency', 30, ...
    'Verbose', false, ...
    'Plots', 'training-progress');
% train the network
net = trainNetwork(imdsTrain, layers, options);
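After trainNetwork finishes, a quick check on the held-out validation images shows whether the attention-augmented network actually learned; a small sketch reusing the variables defined above:
% evaluate the trained network on the validation set
YPred = classify(net, imdsValidation);
accuracy = mean(YPred == imdsValidation.Labels)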
With the network and training options above, training starts without any errors, as seen in the training-progress screenshot below.
[Screenshot: training-progress plot]
For more information, please refer to the documentation for "selfAttentionLayer" and "flattenLayer".
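The same idea carries over to the ResNet-101 network from the original question: edit the layer graph so that a flattenLayer and a selfAttentionLayer sit between the global pooling layer and the fully connected layer. The sketch below is a minimal example; it assumes the layer names 'pool5' and 'fc1000' of the pretrained resnet101 (check layerGraph(net).Layers for the exact names), and the attention sizes are placeholder values:
% insert flatten + self-attention into ResNet-101 between pooling and fc
net = resnet101;                  % requires the ResNet-101 support package
lgraph = layerGraph(net);
newLayers = [
    flattenLayer('Name', 'flatten')                        % SSCB -> CB
    selfAttentionLayer(8, 64, 'Name', 'self_attention')];  % example sizes
lgraph = addLayers(lgraph, newLayers);
lgraph = disconnectLayers(lgraph, 'pool5', 'fc1000');
lgraph = connectLayers(lgraph, 'pool5', 'flatten');
lgraph = connectLayers(lgraph, 'self_attention', 'fc1000');
analyzeNetwork(lgraph)            % verify the modified graph before training
Note that for a new classification task the final fully connected, softmax, and classification layers would also need to be replaced to match the new number of classes.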
0 Comments