NEURAL NETWORK reproducibility of results using neural networks without initfcn = 'rands'

I don't understand why each time I train my network I obtain different results. I create my neural network with the newff function:
net = newff(input,target,3);
I set the init function to 'initzero' as below:
net.inputWeights{1,1}.initFcn = 'initzero';
net.layerWeights{2,1}.initFcn = 'initzero';
net.biases{1}.initFcn = 'initzero';
net.biases{2}.initFcn = 'initzero';
Then I initialize the network and train it:
net = init(net);
net = train(net,input,target);
And yet each run gives me different results! How is that possible? Where is the randomness hidden?
PS: I'm using 'trainbr' as the trainFcn:
net.trainFcn = 'trainbr';

Accepted Answer

Greg Heath 2012-10-27
The answer is that 'dividerand', the default data division function, randomizes the division of the input/target pairs into training, validation, and test sets, so each run trains on a different subset.
Hope this helps.
Thank you for formally accepting my answer.
Greg
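In other words, with the zero weights set in the question, the leftover randomness is the data division. A minimal sketch of making it repeatable (assuming a release where rng is available, R2011a or later):
% Seeding the generator just before train makes dividerand draw the same split
rng(0);                            % fixed seed: same train/val/test division each run
net = train(net, input, target);   % repeated runs now produce identical results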

More Answers (4)

faramarz sa 2013-10-22
Edited: faramarz sa 2013-10-22
Different MATLAB Neural Network Toolbox results come from two sources: 1) random data division and 2) random weight initialization.
For the data division problem, use the function 'divideblock' or 'divideint' instead of 'dividerand', like this:
net.divideFcn = 'divideblock';
net.divideParam.trainRatio = 0.7;
net.divideParam.valRatio = 0.15;
net.divideParam.testRatio = 0.15;
For the random weight initialization problem, it seems (I'm not sure) that all the MATLAB initialization functions ('initzero', 'initlay', 'initwb', 'initnw') behave randomly. So you should force these functions to produce the same results on every call by seeding the global random stream:
RandStream.setGlobalStream(RandStream('mrg32k3a', 'Seed', 1234));
And then use one of them:
net.initFcn = 'initlay';
net.layers{i}.initFcn = 'initnw';
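Putting the two fixes together, a minimal sketch of a fully repeatable run (using the input/target variables and ratios from above):
% Fix the RNG seed, then use a deterministic data division
RandStream.setGlobalStream(RandStream('mrg32k3a', 'Seed', 1234));
net = newff(input, target, 3);
net.divideFcn = 'divideblock';       % deterministic split instead of dividerand
net.divideParam.trainRatio = 0.7;
net.divideParam.valRatio = 0.15;
net.divideParam.testRatio = 0.15;
net = init(net);                     % weights now depend only on the fixed seed
net = train(net, input, target);     % repeated runs of this script match exactly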

Greg Heath 2011-11-26
I don't know.
Avoid the issue by initializing rand before calling newff. Then just accept the resulting default initnw weights automatically provided by newff.
Hope this helps (at least you will get reproducible results!).
Greg
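A sketch of this suggestion (rng is the modern seeding call; around 2011 the idiom was rand('twister', 0)):
rng(0);                           % seed the generator before creating the network
net = newff(input, target, 3);    % default initnw weights are now reproducible
net = train(net, input, target);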

Greg Heath 2011-12-3
Try printing out the weights just before the call to TRAIN to see if they are different. Don't forget to initialize RAND before calling NEWFF.
Hope this helps.
Greg
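For reference, a sketch of that check (celldisp prints the contents of a cell array):
% Dump all weights and biases just before training to compare across runs
celldisp(net.IW)    % input weight matrices
celldisp(net.LW)    % layer weight matrices
celldisp(net.b)     % bias vectors
net = train(net, input, target);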
1 Comment
Flo Trentini 2011-12-4
Hey Greg, thank you for helping me ^^ you're the only soul out here haha
OK, so I've tried your ideas, but if I understand well, it means that we don't actually fix the issue nor FIND the reason why the results change, right?
Anyway, unfortunately when I reinitialize the random generator, I "randomly" land in a case where the results are bad. So I could just not reinitialize it to the 'zero' state, but instead wait for the results to be good, and then register the state of the random generator in order to reinitialize it each time to this special state, and get good results each time I "reinitialize". (I'm not sure I've been very clear, let me know if I'm not.) But I think doing this is kind of "cheating" on my model results, and in any case it means I don't understand what's really going on inside my code...
Feel free to have new ideas, I'll try them ^^
Flo
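The state-saving idea described in this comment could look like the following sketch (assuming rng, R2011a or later; 'goodstate.mat' is a hypothetical filename):
s = rng;                            % capture the generator state before this run
net = init(net);
net = train(net, input, target);
% if this run turned out well, keep s (e.g., save('goodstate.mat', 's')) and later:
rng(s);                             % restore the exact saved state...
net = init(net);
net = train(net, input, target);    % ...to reproduce the earlier "good" run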



Greg Heath 2011-12-5
Weight space contains jillions of local minima. Therefore I never expect to get a good solution the first time around. That is why I usually use a double loop over (outer) the number of hidden nodes and (inner) multiple random weight initializations (typically for i = 1:10; for j = 1:10; ...).
Sometimes I have to repeat this several times. Try searching the newsgroup using
heath newff Ntrials
Hope this helps.
Greg
P.S. How many good solutions of the XOR problem do you get for H = 1:5; Ntrials = 1:10?
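A sketch of the double loop described above (the bestNet bookkeeping is an illustrative assumption, not Greg's exact code):
bestPerf = Inf;
for h = 1:10                             % outer loop: number of hidden nodes
    for trial = 1:10                     % inner loop: random weight initializations
        net = newff(input, target, h);   % new random initnw weights each pass
        net = train(net, input, target);
        e = target - sim(net, input);
        perf = mean(e(:).^2);            % mean squared error over all samples
        if perf < bestPerf
            bestPerf = perf;
            bestNet = net;               % keep the best of the 100 designs
        end
    end
end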
