Bad classification even after training neural network

After training the neural network, I get a correct classification rate of 98.5 percent in the confusion matrix. But when I test it with sample data, it classifies wrongly. Any reasons for this?
Here is the code I am using for training:
rng('default');
load ina.mat
load inb.mat
inputs=mapminmax(ina);
targets=inb;
size(inputs);
p=inputs;
% Create a Pattern Recognition Network
hiddenLayerSize = 40;
net = patternnet(hiddenLayerSize);
% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.outputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';     % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% For a list of all training functions type: help nntrain
net.trainFcn = 'trainscg';  % Scaled conjugate gradient
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean squared error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
    'plotregression', 'plotfit'};
net.trainParam.max_fail = 55;
net.trainParam.min_grad=1e-10;
net.trainParam.show=10;
net.trainParam.lr=0.01;
net.trainParam.epochs=1000;
net.trainParam.goal=0.001;
% Train the Network
[net,tr] = train(net,inputs,targets);
% Test the Network
outputs = net(inputs);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
% Recalculate Training, Validation and Test Performance
trainTargets = targets .* tr.trainMask{1};
valTargets = targets .* tr.valMask{1};
testTargets = targets .* tr.testMask{1};
trainPerformance = perform(net,trainTargets,outputs)
valPerformance = perform(net,valTargets,outputs)
testPerformance = perform(net,testTargets,outputs)
disp('after training')
y1 = sim(net,p);
y1=abs(y1);
y1=round(y1)
disp(y1)
save E:\final_new\final\net;
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
figure, plotconfusion(targets,outputs);
%figure, ploterrhist(errors)
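One thing worth checking in the code above (an assumption on my part, since the test script is not shown in the question): `mapminmax(ina)` computes its normalization range from the training data, and any new sample data must be transformed with those same saved settings rather than re-normalized from scratch. A minimal sketch, where `sample` stands for a hypothetical new data matrix with the same row layout as `ina`:

```matlab
% Capture the normalization settings when preparing the training inputs
[inputs, ps] = mapminmax(ina);

% ... train the network as above ...

% At test time, apply the SAME settings to the new data,
% instead of calling mapminmax(sample) directly
sampleNorm = mapminmax('apply', sample, ps);
y = net(sampleNorm);
```

If the test data is normalized independently, the network sees inputs on a different scale than it was trained on, which can produce exactly the "good confusion matrix, bad test classification" symptom described here.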

Accepted Answer

Greg Heath 2014-4-20
Two possibilities:
1. The training data does not adequately characterize the total data set.
2. The net is overfit with too many weights AND is overtrained past the point where it trades away the ability to work well on nontraining data for a further decrease in training error.
Are you using validation stopping?
Are your training, validation and test sets randomly chosen?
What are the data division ratios?
Hope this helps.
Thank you for formally accepting my answer
Greg
2 Comments
dream 2014-4-20
I am using performance stopping. Check out my edited question.
Greg Heath 2014-4-23 (Edited: 2014-4-28)
You are using validation stopping. I have never heard of "performance stopping" (However, I have only been designing NNs for 35 years).
Oh, I see: max_fail = 55.
I don't know size(input) and size(target). However, I'm quite sure that H=17 hidden layer nodes is overkill.
Whoops! Now you have H = 40? Bad move; try going the other way.
You should try to minimize H and for every candidate value design ~10 candidates with different initial weights. Use the validation set performance to rank the nets. Obtain unbiased estimates of unseen data error rates using the test set performance.
I have posted many, many designs in the NEWSGROUP and ANSWERS. Search on
greg patternnet Ntrials
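For readers hunting for those posts, the search-over-H loop Greg describes can be sketched as follows. This is an illustrative outline, not code from the thread: `x`/`t` stand for the input and target matrices, and `Hmax = 10`, `Ntrials = 10` are example choices.

```matlab
% Search over hidden layer sizes; for each H, train Ntrials nets with
% different random initial weights and rank them by validation performance.
Hmax = 10;
Ntrials = 10;
bestPerf = Inf;
for H = 1:Hmax
    for trial = 1:Ntrials
        net = patternnet(H);
        net.divideParam.trainRatio = 0.70;
        net.divideParam.valRatio   = 0.15;
        net.divideParam.testRatio  = 0.15;
        [net, tr] = train(net, x, t);  % new random weights on each call
        if tr.best_vperf < bestPerf    % validation performance at stopping
            bestPerf = tr.best_vperf;
            bestNet  = net;
            bestTr   = tr;
        end
    end
end
% Unbiased estimate of unseen-data error from the winner's test set
y = bestNet(x);
testPerf = perform(bestNet, t .* bestTr.testMask{1}, y)
```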


More Answers (0)
