Add a confusion matrix to my cross-validated code for LSTM classification

2 views (last 30 days)
In the code below I have used an LSTM for classification of audio data and added K-fold cross-validation. Now I would like to show all of the results from cross-validation in one confusion matrix. How can I do that?
clear all
close all
TrainRatio=0.8;
ValidationRatio=0.1;
folder='/Users/pooyan/Documents/normal/'; % change this path to your normal data folder
audio_files=dir(fullfile(folder,'*.ogg'));
nfileNum=length(audio_files);
nfileNum=10; % limit to the first 10 normal files
normal=[];
for i = 1:nfileNum
normal_name = [folder audio_files(i).name];
normal(i,:) = audioread(normal_name); % assumes every recording has the same number of samples
end
normal=normal';
nLabels = repelem(categorical("normal"),nfileNum,1);
folder='/Users/pooyan/Documents/anomaly/'; % change this path to your anomaly data folder
audio_files=dir(fullfile(folder,'*.ogg'));
afileNum=length(audio_files);
anomaly=[];
for i = 1:afileNum
anomaly_name = [folder audio_files(i).name];
anomaly(i,:) = audioread(anomaly_name); % assumes every recording has the same number of samples
end
anomaly=anomaly';
aLabels = repelem(categorical("anomaly"),afileNum,1);
% randomize the inputs if necessary
%normal=normal(:,randperm(nfileNum, nfileNum));
%anomaly=anomaly(:,randperm(afileNum, afileNum));
AllData = [normal anomaly];
Labels=[nLabels; aLabels];
% K indicates K-fold cross validation
K=2;
cv = cvpartition(Labels,'KFold',K);
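% cv.test(i) returns a logical index vector that is true for the observations in
% the i-th test fold; ~cv.test(i) selects the remaining data, which is split
% below into training and validation sets.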
% nTrainNum = round(nfileNum*TrainRatio*0.1);
% aTrainNum = round(afileNum*TrainRatio*0.1);
% nValidationNum = round(nfileNum*ValidationRatio*0.1);
% aValidationNum = round(afileNum*ValidationRatio*0.1);
for i=1:K
audioTest = AllData(:, cv.test(i));
labelsTest = Labels(cv.test(i));
audioTrainValidation = AllData(:, ~cv.test(i));
labelsTrainValidation = Labels(~cv.test(i));
% Vp: 10% from training dataset used for validation;
Vp=0.1;
TVL=length(labelsTrainValidation);
ValidationIndex = randperm(TVL, floor(TVL*Vp));
TrainIndex=1:TVL;
TrainIndex(ValidationIndex)=[];
audioTrain = audioTrainValidation(:, TrainIndex);
labelsTrain = labelsTrainValidation(TrainIndex);
audioValidation = audioTrainValidation(:, ValidationIndex);
labelsValidation = labelsTrainValidation(ValidationIndex);
% audioTrain = [normal(:,((i-1)*nTrainNum)+1:i*nTrainNum),anomaly(:,((i-1)*aTrainNum)+1:i*aTrainNum)];
% labelsTrain = [nLabels(((i-1)*nTrainNum)+1:i*nTrainNum);aLabels(((i-1)*aTrainNum)+1:i*aTrainNum)];
%
% audioValidation = [normal(:,i*(nTrainNum+1:nTrainNum+nValidationNum)),anomaly(:,i*(aTrainNum+1:aTrainNum+aValidationNum))];
% labelsValidation = [nLabels(i*(nTrainNum+1):i*(nTrainNum+nValidationNum));aLabels(i*(aTrainNum+1:aTrainNum+aValidationNum))];
%
% audioTest = [normal(:,i*(nTrainNum+nValidationNum+1):end),anomaly(:,i*(aTrainNum+aValidationNum+1):end)];
% labelsTest = [nLabels(i*(nTrainNum+nValidationNum+1):end); aLabels(i*(aTrainNum+aValidationNum+1):end)];
fs=44100;
% Create an audioFeatureExtractor object
%to extract the centroid and slope of the mel spectrum over time.
aFE = audioFeatureExtractor("SampleRate",fs, ... %Fs
"SpectralDescriptorInput","melSpectrum", ...
"spectralCentroid",true, ...
"spectralSlope",true);
featuresTrain = extract(aFE,audioTrain);
[numHopsPerSequence,numFeatures,numSignals] = size(featuresTrain);
%treat the extracted features as sequences and use a
%sequenceInputLayer as the first layer of your deep learning model.
featuresTrain = permute(featuresTrain,[2,1,3]); % reorder to [numFeatures x numHops x numSignals]
featuresTrain = squeeze(num2cell(featuresTrain,[1,2])); % one cell per signal, each numFeatures x numHops
numSignals = numel(featuresTrain); %number of signals of normal and anomalies
[numFeatures,numHopsPerSequence] = size(featuresTrain{1});
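% trainNetwork expects sequence data as an N-by-1 cell array in which each cell
% holds a [numFeatures x numTimeSteps] matrix, which is the format built above.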
%Extract the validation features.
featuresValidation = extract(aFE,audioValidation);
featuresValidation = permute(featuresValidation,[2,1,3]);
featuresValidation = squeeze(num2cell(featuresValidation,[1,2]));
%Define the network architecture.
layers = [ ...
sequenceInputLayer(numFeatures)
lstmLayer(50,"OutputMode","last")
fullyConnectedLayer(numel(unique(labelsTrain))) % one output per class
softmaxLayer
classificationLayer];
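% Note: "OutputMode","last" makes the lstmLayer output only its final time step,
% so the network maps each whole feature sequence to a single class label.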
%To define the training options
options = trainingOptions("adam", ...
"Shuffle","every-epoch", ...
"ValidationData",{featuresValidation,labelsValidation}, ... %%labelValidatin=audioValidation
"Plots","training-progress", ...
"Verbose",false);
%To train the network
net = trainNetwork(featuresTrain,labelsTrain,layers,options);
%Test the network on the held-out test fold
TestFeature = extract(aFE, audioTest);
predict = categorical.empty(1,0); % reset per fold so predictions from a previous fold do not linger (shadows the built-in predict function)
for j = 1:size(TestFeature, 3) % use j so the outer fold index i is not overwritten
TestFeatureIn = TestFeature(:,:,j)'; % [numFeatures x numHops] for one test signal
predict(j) = classify(net,TestFeatureIn);
end
%Confusion matrix for this fold
%plotconfusion(labelsTest,predict')
C = confusionmat(labelsTest,predict')
confusionchart(labelsTest,predict')
end

Answers (1)

Divya Gaddipati 2020-12-31
Hi,
You can accumulate the results at the end of each iteration of the cross-validation loop (initialize catLabels and catPredictions as empty arrays before the loop first):
catLabels = [catLabels; labelsTest];
catPredictions = [catPredictions; predict'];
Then, outside the loop, you can calculate the overall confusion matrix:
C = confusionmat(catLabels,catPredictions);
confusionchart(catLabels,catPredictions);
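Putting the pieces together, here is a minimal sketch of the whole pattern, assuming the K-fold loop, labelsTest, and the predict vector from the code in the question:
catLabels = categorical([]); % accumulated true labels over all folds
catPredictions = categorical([]); % accumulated predicted labels over all folds
for i = 1:K
% ... split the data, train the network, and classify the test fold
% exactly as in the question, producing labelsTest and predict ...
catLabels = [catLabels; labelsTest];
catPredictions = [catPredictions; predict'];
end
% one confusion matrix covering every test fold
C = confusionmat(catLabels,catPredictions)
confusionchart(catLabels,catPredictions)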
