NN accuracy on test set low
I have implemented a neural network in MATLAB R2013a for character recognition, using the trainbr function for training. 80% of the samples were used for training and the rest for testing. When I plot the confusion matrix, I get 100% accuracy on the training set, but on the test set the accuracy is very low (around 60%). What could possibly be wrong?
Accepted Answer
More Answers (3)
Greg Heath
2014-3-13
2 votes
Insufficient info:
How many characters?
How many examples for each character?
What are the dimensions of the input and target matrices?
Are the summary statistics of the training and test subsets sufficiently similar?
How many input, hidden and output nodes?
What values of hidden nodes did you try?
How many random weight initializations for each value?
Although trainbr should mitigate the effect of using more hidden nodes than are needed, you still need many trials to establish sufficient confidence intervals.
Hope this helps.
Thank you for formally accepting my answer
Greg
6 comments
Anitha
2014-3-13
Greg Heath
2014-3-14
Sorry, that does not make sense to me. Consider the following:
13 characters of the alphabet A-to-M
234 examples, 18 for each character
All characters are columnized 8x5 images
size(input) = [ 40 234]
size(target) = [ 13 234] % columns of eye(13)
Where did I go wrong?
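Greg's worked example can be checked directly. This is a hypothetical sketch with placeholder pixel data (rand), since only the matrix dimensions matter here:

```matlab
% Hypothetical setup matching the example above: 13 characters (A-M),
% 18 examples each, each a columnized 8x5 image.
nChars = 13; nPer = 18; N = nChars*nPer;      % N = 234
input  = rand(8*5, N);                        % placeholder pixel columns
target = kron(eye(nChars), ones(1, nPer));    % columns of eye(13), 18 per class
size(input)   % [40 234]
size(target)  % [13 234]
```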
Anitha
2014-3-15
Greg Heath
2014-3-15
[ I N ] = [ 18 234]
[ O N ] = [ 13 234]
The default trn/val/tst split for trainbr is 0.8/0.0/0.2. The resulting number of training equations is
Ntrn = N - round(0.2*N) % 187
Ntrneq = Ntrn*O % 2431
With H=30 hidden nodes, the number of unknown weights is
Nw = (I+1)*H+(H+1)*O % 973
The ratio is
r = Ntrneq/Nw % ~2.5
which should be OK for trainbr.
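The counts above can be reproduced in a few lines (H = 30 hidden nodes is the assumed value from the comment):

```matlab
% Recompute the equation/weight counts for trainbr's default
% 0.8/0.0/0.2 trn/val/tst split.
N = 234; I = 18; O = 13; H = 30;
Ntrn   = N - round(0.2*N)        % 187 training examples
Ntrneq = Ntrn*O                  % 2431 training equations
Nw     = (I+1)*H + (H+1)*O       % 973 unknown weights
r      = Ntrneq/Nw               % ~2.5
```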
I suggest making multiple designs (20?) in a loop with different mixes of training examples, testing examples and initial weights. For examples of multiple designs in a loop, search using
greg Ntrials
Post your code if you still have problems.
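A minimal sketch of such a multi-design loop (illustrative only: the names Ntrials and bestnet, and the choice of patternnet, are assumptions; input and target are the matrices discussed above). Calling configure inside the loop gives each trial fresh random initial weights, and each call to train draws a new random trn/tst division:

```matlab
% Sketch of a multiple-design loop, assuming input (I x N) and
% target (O x N) matrices are already in the workspace.
Ntrials  = 20;
H        = 30;
bestperf = Inf;
for trial = 1:Ntrials
    rng(trial)                            % reproducible random state
    net = patternnet(H, 'trainbr');       % classification net, Bayesian regularization
    net = configure(net, input, target);  % fresh random initial weights
    [net, tr] = train(net, input, target);
    ytst = net(input(:, tr.testInd));     % evaluate on this trial's test split
    perf = perform(net, target(:, tr.testInd), ytst);
    if perf < bestperf
        bestperf = perf;
        bestnet  = net;                   % keep the best design
    end
end
```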
Greg Heath
2014-3-19
Two mistakes:
1. No configure statement in the loop
2. Used net instead of bestnet in the last train statement
Greg Heath
2014-3-16
1. Not necessary to specify default process functions.
2. How did you know my birthdate is 4151941 ??
3. You are reusing the same net for each trial without using CONFIGURE.
Therefore, the initial weights of each trial are the final weights of the last trial.
I suspect that if the design results are not monotonically better, it is because TRAIN is using a new trn/tst division.
4. Use configure after the RNG initialization.
5. An alternate approach is to CONTINUALLY save one or all of
a. the best current RNG state
b. the best current net
c. the best current Wb = getwb(net)
6. I think you should do all three at the same time and compare results
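Item 5 might look like the following sketch (the names beststate, bestWb, and Ntrials are illustrative; input and target are assumed to exist as in the earlier comments). setwb can later restore the saved weight vector into a configured net:

```matlab
% Sketch of continually saving the best RNG state, net, and weights.
Ntrials = 20; bestperf = Inf;
for trial = 1:Ntrials
    s   = rng;                            % current RNG state, before init
    net = patternnet(30, 'trainbr');
    net = configure(net, input, target);  % fresh random initial weights
    [net, tr] = train(net, input, target);
    if tr.best_perf < bestperf
        bestperf  = tr.best_perf;
        beststate = s;                    % a. best current RNG state
        bestnet   = net;                  % b. best current net
        bestWb    = getwb(net);           % c. best current Wb
    end
end
% The winning weights can later be restored with setwb:
% net = setwb(bestnet, bestWb);
```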
Hope this helps.
Thank you for formally accepting my answer
Greg
2 comments
Anitha
2014-3-17
Greg Heath
2014-3-19
The second train statement contains net instead of bestnet