Why do I have the same validation accuracy for every epoch?

I developed a CNN for ECG arrhythmia classification, and when I train the model I obtain the same validation accuracy for all 50 epochs. Can you please tell me what is wrong? I have tried modifying the parameters and also the network structure, but the validation accuracy stays unchanged (80.1%).
Labels=cnnLabels(labels); %Divide and add LABELS in classes from 1 to 5
[heartbeats, ~, ~] = featureNormalize(heartbeats); % Normalization
%% Spliting the data in training and testing sets
PercentNumFiles =round(0.90*length(heartbeats)); %90% of the files for training and 10% for testing
trainingPercent=round(0.9*PercentNumFiles); %90% of the training files are used for training and the other 10% for validation
randomNum(:,1)=randperm(length(heartbeats)); %random selection of the files
Xtrain=heartbeats(randomNum(1:trainingPercent),:);
Xvalidation=heartbeats(randomNum(trainingPercent+1:PercentNumFiles),:);
Xtest=heartbeats(randomNum(PercentNumFiles+1:end),:);
Ytrain=Labels(randomNum(1:trainingPercent));
Yvalidation=Labels(randomNum(trainingPercent+1:PercentNumFiles));
Ytest=Labels(randomNum(PercentNumFiles+1:end));
%% CNN
clear ECG fs ind before after anntype cleanECG PercentNumFiles randomNum
height = 1;
width = 300;
channels = 1;
Xtrain = reshape(Xtrain,[height, width, channels, length(Xtrain)]);
Xvalidation=reshape(Xvalidation,[height, width, channels, length(Xvalidation)]);
Xtest = reshape(Xtest,[height, width, channels, length(Xtest)]);
Ytrain=categorical(Ytrain);
Yvalidation=categorical(Yvalidation);
Ytest=categorical(Ytest);
%% CNN construction
% classes = [1 2 3 4 5];
% classWeights = [0.1 0.7 0.6 0.9 0.3];
classWeights = 1./countcats(Ytrain);
classWeights = classWeights'/mean(classWeights);
Layers=[
imageInputLayer([height,width,channels]); %'DataAugmentation', 'none'); %'Normalization', 'none');
convolution2dLayer([1 3], 256,'stride',[1 1], 'padding','same'); %Filter window size = [1 3], number of filters = 256, stride = [1 1]
convolution2dLayer([1 3], 256,'stride',[1 1], 'padding','same');
reluLayer();
dropoutLayer();
maxPooling2dLayer([1 2],'stride',[1 2]); %PoolSize = [1 2], Stride = [1 2]
convolution2dLayer([1 3], 128,'stride',[1 1], 'padding','same');
reluLayer();
convolution2dLayer([1 3], 128,'stride',[1 1], 'padding','same');
reluLayer();
convolution2dLayer([1 3], 64,'stride',[1 1], 'padding','same');
reluLayer();
dropoutLayer();
maxPooling2dLayer([1 2],'stride',[1 2]); %PoolSize = [1 2], Stride = [1 2]
fullyConnectedLayer(256);
dropoutLayer();
fullyConnectedLayer(128);
fullyConnectedLayer(5); %Reduce to five output classes
softmaxLayer();
classificationLayer();
];
%% Options of training
options = trainingOptions('sgdm','InitialLearnRate',0.001,'MaxEpochs',50, ...
'LearnRateSchedule','piecewise', ...
'LearnRateDropFactor',0.1, ...
'LearnRateDropPeriod', 3,...
'L2Regularization',1.0000e-04, ...
'MiniBatchSize', 60,...
'ValidationData',{Xvalidation, Yvalidation},...
'Plots','training-progress');
convnet = trainNetwork(Xtrain,Ytrain,Layers,options);

Accepted Answer

Aditya Patil on 10 May 2021
As per my understanding, the data you have is one-dimensional and time-varying.
It might be a better option to consider this as a sequence classification problem. Have a look at this example for classifying ECG using LSTMs. There are also video tutorials for the same.
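A minimal sketch of that sequence-based approach might look like the following (layer sizes are illustrative only, and XtrainRows/XvalRows are hypothetical names for the N-by-300 heartbeat matrices from the split above, taken before the 4-D reshape):
XtrainSeq = num2cell(XtrainRows, 2);   % N-by-1 cell array of 1-by-300 sequences
XvalSeq   = num2cell(XvalRows, 2);
lstmLayers = [
    sequenceInputLayer(1)                    % one feature (ECG amplitude) per time step
    bilstmLayer(100, 'OutputMode', 'last')   % keep only the final hidden state
    fullyConnectedLayer(5)                   % five arrhythmia classes
    softmaxLayer
    classificationLayer];
lstmOptions = trainingOptions('adam', ...
    'MaxEpochs', 30, ...
    'MiniBatchSize', 60, ...
    'ValidationData', {XvalSeq, Yvalidation}, ...
    'Plots', 'training-progress');
lstmNet = trainNetwork(XtrainSeq, Ytrain, lstmLayers, lstmOptions);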
For the current model, it might be predicting the largest class for all the observations. You can check this by looking at the confusion matrix or the outputs.
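For example, a quick check along those lines (using convnet, Xvalidation, and Yvalidation from the code above) could be:
YpredVal = classify(convnet, Xvalidation);   % predicted classes for the validation set
confusionchart(Yvalidation, YpredVal);       % R2018b+; confusionmat works as well
summary(YpredVal)                            % counts per predicted class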
1 Comment
Ioana Cretu on 18 May 2021
Thank you! I really wanted to use a 1D CNN, or at least to combine it with LSTMs. I modified the code so I can feed the CNN features into an LSTM, but I get another error.
Layers=[
sequenceInputLayer(inputSize,'Normalization', 'zscore', 'Name','input');
sequenceFoldingLayer('Name','fold')
convolution2dLayer([1 7], 16,'stride',[1 1], 'padding','same','Name','conv1')
batchNormalizationLayer('Name','bn1')
maxPooling2dLayer([1 2],'stride',[1 2],'Name','mpool1')
convolution2dLayer([1 7], 32,'stride',[1 1], 'padding','same','Name','conv2')
batchNormalizationLayer('Name','bn2')
reluLayer('Name','relu1')
maxPooling2dLayer([1 2],'stride',[1 2],'Name','mpool2')
convolution2dLayer([1 5], 64,'stride',[1 1], 'padding','same','Name','conv3')
batchNormalizationLayer('Name','bn3')
reluLayer('Name','relu2')
convolution2dLayer([1 5], 128,'stride',[1 1], 'padding','same','Name','conv4')
batchNormalizationLayer('Name','bn4')
reluLayer('Name','relu3')
convolution2dLayer([1 3], 256,'stride',[1 1], 'padding','same','Name','conv5')
batchNormalizationLayer('Name','bn5')
reluLayer('Name','relu4')
maxPooling2dLayer([1 2],'stride',[1 2],'Name','mpool3')
convolution2dLayer([1 3], 512,'stride',[1 1], 'padding','same','Name','conv6')
batchNormalizationLayer('Name','bn6')
reluLayer('Name','relu5')
maxPooling2dLayer([1 2],'stride',[1 2],'Name','mpool4')
% dropoutLayer('Name','dropout')
sequenceUnfoldingLayer('Name','unfold')
flattenLayer('Name','flatten')
bilstmLayer(200,'Name','lstm')
reluLayer('Name','relu6')
fullyConnectedLayer(256,'Name','fc1')
reluLayer('Name','relu7')
fullyConnectedLayer(128,'Name','fc2')
reluLayer('Name','relu8')
fullyConnectedLayer(5,'Name','fc3')
softmaxLayer('Name','softmax')
classificationLayer('Name','classification')
];
lgraph = layerGraph(Layers);
lgraph = connectLayers(lgraph,'fold/miniBatchSize','unfold/miniBatchSize');
When I run this:
convnet = trainNetwork(Xtrain,Ytrain,Layers,options);
I get this error:
Error using trainNetwork (line 170)
Invalid network.
Caused by:
Layer 'fold': Unconnected output. Each layer output must be connected to the input of another layer.
Detected unconnected outputs:
output 'miniBatchSize'
Layer 'unfold': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'miniBatchSize'
Can you please give me an idea of what the cause is? I did connect them using lgraph.
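For reference, trainNetwork has to be given the connected layer graph rather than the original Layers array; otherwise the fold/unfold connection made with connectLayers is not part of the network being trained. A sketch of that call, assuming Xtrain and Ytrain are already in the sequence format expected by sequenceInputLayer:
lgraph = layerGraph(Layers);
lgraph = connectLayers(lgraph,'fold/miniBatchSize','unfold/miniBatchSize');
convnet = trainNetwork(Xtrain,Ytrain,lgraph,options);   % pass lgraph, not Layers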



Release: R2020a
