Similar to what I am trying to do with a NarxNet, I am also using a Fitnet to see how far ahead I can predict within a certain maximum error. The script I wrote loads a file with over 3500 data points. From these, I start by training the network on only the portion corresponding to about 4 years of data and run 10 trials. I then test the network on values outside the training sample and compare its output with the targets I have.
After the 10 trials, the training sample is increased by 1 year and the network is trained again. This is repeated until near the end of my data (about 42 years, so 42 cycles).
From my experiments with the sine function, I don't expect this model to give very good results, but I am required to do this.
My question is whether there are any suggestions on how I can improve my code to reduce overfitting and overtraining, for example from using too many neurons or layers (see the sketch below for the kind of change I mean).
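For example, is something along these lines the right direction? This is just a sketch of what I mean, using the toolbox's block-wise data division and validation-based early stopping; I have not tested these exact settings:

net = fitnet([hiddenLayerSize,hiddenLayerSize]);
% keep the time order: contiguous blocks for training/validation/test
net.divideFcn = 'divideblock';
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
% stop training sooner once the validation error keeps increasing
net.trainParam.max_fail = 6;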
NOTE: My code saves each trial's data in its own file, and at the end of the 10 trials it saves the prediction errors in another file. This is done in case I need access to any of the random networks later for testing. The error file allows easy verification of the network results for each cycle across all trials.
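For example, I expect to be able to recover a particular trial network later with something like this (assuming the file names produced by the save call in the code below):

% load only the trained network from cycle 3, trial 7
tmp = load('cycle37.mat', 'net');
net_restored = tmp.net;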
Thanks in advance!
My code is as follows:
trials = 10;                        % number of random-initialisation trials per cycle
hiddenLayerSize = 6;                % neurons in each of the two hidden layers
endyear = 42;                       % number of cycles (one per extra year of training data)
pred_period = 5;                    % prediction horizon in years
data_start = 38034;
data_finish = 39489;
S = 1;                              % first training sample (fixed)
N = (data_finish-data_start)/5+1;   % last training sample, about 4 years of data
prediction_size = pred_period*73;   % 73 samples per year

% x and y are assumed to be already loaded from the data file
net = fitnet([hiddenLayerSize,hiddenLayerSize]);

for j = 1:endyear
    table_errors = zeros(trials,prediction_size);
    for i = 1:trials
        identifier_trial = i;       % kept so they end up in the saved workspace file
        identifier_cycle = j;

        % re-initialise the weights so every trial starts from a new random network
        net = init(net);

        % train on the current window x(S:N)
        train_inputs = x(S:N)';
        train_targets = y(S:N)';
        [net,tr] = train(net,train_inputs,train_targets);
        train_outputs = net(train_inputs);
        train_errors = gsubtract(train_targets,train_outputs);

        % predict the next pred_period years and compare with the known targets
        pred_inputs = x(N+1:N+prediction_size)';
        pred_targets = y(N+1:N+prediction_size)';
        pred_outputs = net(pred_inputs);
        table_errors(i,:) = gsubtract(pred_targets,pred_outputs);

        % save the full workspace of this trial, e.g. cycle31.mat for cycle 3, trial 1
        save(['cycle' int2str(j) int2str(i)]);
    end

    % save the prediction errors of all trials of this cycle
    save(['errors' int2str(j)], 'table_errors');

    N = N+73;                       % grow the training window by one more year
end
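In case it helps to follow the indexing: my assumption is 73 samples per year (5-day spacing), so the constants above work out roughly as follows.

% sanity check of the constants (assuming 73 samples per year, i.e. 5-day spacing)
samples_per_year = 73;
N_initial = (39489 - 38034)/5 + 1;   % = 292 = 4*73 -> about 4 years of training data
horizon   = 5*samples_per_year;      % = 365 points -> 5 years predicted ahead
% each cycle then extends the training window by another 73 points (1 year)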