How to "retrain" Neural Networks?

1 view (last 30 days)
Daniel 2011-11-22
Hi,
I have an algorithm for forecasting a time series that relies on past observed values of the series. The algorithm is based on several so-called experts, and, based on their past performance, a convex combination of the experts is used for the final prediction.
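For illustration, a minimal sketch of what I mean by the convex combination (the expert forecasts f and performance scores perf below are just made-up placeholders, not my real code):
f = [1.2; 0.9; 1.1];        % forecasts of three hypothetical experts
perf = [0.8; 0.5; 0.7];     % made-up scores of their past performance
w = perf / sum(perf);       % convex weights: w >= 0 and sum(w) == 1
prediction = w' * f;        % final forecast = convex combination of the experts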
In the case of neural networks, one of the parameters of the experts is the number of hidden neurons h used. Simplified, I have a loop
for i=1:n
    net = newfit(inputs(:,1:i),targets(1:i),h);
    net.divideParam.trainRatio=75/100;
    net.divideParam.valRatio=25/100;
    net.trainParam.showWindow=false;
    net=train(net,inputs(:,1:i),targets(1:i));
    estimate=sim(net,x_eval);
    ...
end
So in every iteration the trained network differs only a little, because only one (input, target) pair is added to the training data; the rest remains unchanged. Is there a way to exploit this fact and accelerate the whole process, e.g. by "retraining"?
Thanks
Daniel

Answers (1)

Greg Heath 2011-11-23
Every time you call newfit you create a new network with random initial weights. Therefore, try something like
net = newfit(inputs,targets,h);
net.divideParam.trainRatio=75/100;
net.divideParam.valRatio=25/100;
net.trainParam.showWindow=false;
for i=1:n
    net = train(net,inputs(:,1:i),targets(1:i));
    estimate=sim(net,x_eval);
    ...
end
Hope this helps.
Greg
  1 Comment
Daniel 2011-11-23
Hi Greg,
this should at least save some time on initialization, thank you. But the training process itself probably consumes a lot more time.
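One idea I might try, building on your suggestion: since the net now keeps its weights between iterations, I could also cap the epoch budget per call to train, so each update only refines the existing fit (the value 20 below is just a guess I would have to tune):
net = newfit(inputs,targets,h);       % create and initialize the net once
net.divideParam.trainRatio=75/100;
net.divideParam.valRatio=25/100;
net.trainParam.showWindow=false;
net.trainParam.epochs=20;             % small, untested epoch budget per update
for i=1:n
    net = train(net,inputs(:,1:i),targets(1:i));   % continues from the current weights
    estimate=sim(net,x_eval);
    % ... rest of my loop body
end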
Daniel
