dividerand followed by divideind doesn't seem to work?
Hi,
When I run the code below, divideind seems to be completely ignored when training the neural net; it trains using the default dividerand instead.
Any ideas?
When looking at what tr returns, we see:
divideFcn: 'dividerand' (I would expect it to say 'divideind')!
trainInd: [1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 19 21 22 23 24 25 26 27 28 29 30 31 33 34 35 36 37 39 41]
valInd: []
testInd: [17 18 20 32 38 40]
There should have been fewer indices in trainInd and more in testInd.
Thank you!
x = -1:0.05:1;
t = sin(2*pi*x) + 0.1*randn(size(x));
[trainT,valT,testT] = dividerand(length(t),.55,.25,.20) % maybe make the validation percentage 0 for trainbr, but not necessary since:
% "Validation stops are disabled by default (max_fail = 0) so that
% training can continue until an optimal combination of errors and
% weights are found"?
trainTsave = trainT;
valTsave = valT;
testTsave = testT; % save for use in all the NNs
% for trainbr we combine validation and test and call them all test
[trainInd,valInd,testInd] = divideind(length(t),trainTsave,[],[valTsave,testTsave]) % order of values in testInd is wrong? But is it a problem?
net = fitnet([20,20],'trainbr'); % [20,20] means 2 hidden layers, the 1st of size 20 and the 2nd of size 20
% help(net.trainFcn) to see the default settings
% disable fitnet from using mapminmax if using my scaling algo?
[net,tr] = train(net,x,t) % trains the NN and tr contains all kinds of metadata
view(net); % should be called after train
y = net(x); % generate the predicted output values based on the NN with the inputs
perf = perform(net,y,t);
Accepted Answer
Viren Gupta
2019-1-2
You can set the properties of 'net' here. For example:
net.divideFcn = 'divideind';
net.divideParam.trainInd = trainInd;
net.divideParam.valInd = valInd;
net.divideParam.testInd = testInd;
Then you can train the model again. You can refer to the following link, which can help.
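For reference, a minimal sketch of the whole corrected flow, assuming the same x, t and the index vectors trainInd, valInd, testInd computed with divideind in the question; after training, tr.divideFcn should report 'divideind':
net = fitnet([20,20],'trainbr'); % same two-hidden-layer architecture as in the question
net.divideFcn = 'divideind'; % use fixed indices instead of the default dividerand
net.divideParam.trainInd = trainInd; % indices computed earlier with divideind
net.divideParam.valInd = valInd; % empty here, since validation was merged into test for trainbr
net.divideParam.testInd = testInd;
[net,tr] = train(net,x,t); % retrain with the custom data division
disp(tr.divideFcn) % should now display 'divideind'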