Strange behaviour with batch neural network learning
My dataset is too large to train on all at once, so I'm using a batch learning scheme like this:
net = fitnet(50,'trainbr');  % 50 hidden units, Bayesian regularization training
numBatches = 10;
batchSize = round(length(trainingData)/numBatches);  % ~10% of the data per batch
for iBatch = 1 : numBatches*3
    % Sample a batch (without replacement within this draw)
    [trainingBatch,batchIndices] = datasample(trainingData',batchSize,'Replace',false);
    outcomesBatch = trainingOutcomes(batchIndices);
    % Continue training the same network on the new batch
    [net,tr] = train(net,trainingBatch',outcomesBatch);
end
Essentially I'm sampling (without replacement) 10% of the data each time to train the neural net with (of course there's no guarantee I'll actually cover all the data by the end of training). I'm confused because I often see that when it begins learning on a new batch, the training and test MSEs go up from the 0th epoch to the 1st epoch; i.e. the fit of the network before it started training on that new batch was better than after it began training on it.
I'm trying to just update/adjust the weights on each batch, but is it possible they're being overwritten? If so, how do I tweak the weights from batch to batch without overwriting them?
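To illustrate the coverage point mentioned above, here is a minimal Python/NumPy sketch (the sample sizes and iteration count mirror the MATLAB loop, but are otherwise hypothetical): each iteration draws 10% of the data without replacement *within* the draw, yet independently across iterations, so after `numBatches*3` iterations roughly (1 - 0.1)^30 ≈ 4% of the data is expected to remain unseen.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 1000          # hypothetical dataset size
num_batches = 10
batch_size = n_samples // num_batches   # 10% per draw, as in the question
n_iterations = num_batches * 3          # as in the MATLAB loop

seen = np.zeros(n_samples, dtype=bool)
for _ in range(n_iterations):
    # Without replacement within this draw, independent across draws
    idx = rng.choice(n_samples, size=batch_size, replace=False)
    seen[idx] = True

coverage = seen.mean()
# Fraction expected never to be drawn: (1 - 0.1)^30 ~ 0.042,
# so coverage lands around 95-96%, not 100%.
expected_uncovered = (1 - batch_size / n_samples) ** n_iterations
print(f"simulated coverage: {coverage:.3f}")
print(f"expected coverage:  {1 - expected_uncovered:.3f}")
```

Shuffling the full index set once and slicing it into 10 disjoint batches per pass would instead guarantee every sample is visited each pass.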