In a NARX, are the weights kept constant from one training process to another with different input data?

Hello everybody,
In my project, I have to develop a NARX network to predict a time series based on information extracted from video frames. For each video, my input data is a row cell array in which each element corresponds to one frame, i.e. one timestep.
I have many videos, but I can only give the network one video at a time, so what I am doing is: give one video, train the NARX, give the next video, train the NARX again, and so on. Note that the videos all have different numbers of frames.
So, my question is the following: am I proceeding correctly, i.e. are the weights and all the other network information kept from one training run to the next? Or am I wasting my time because the weights are re-initialized every time?
Thank you for helping me :)
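For context, a per-video, open-loop training pass with the toolbox's narxnet typically looks roughly like the sketch below; the delay ranges, hidden layer size, and the variable names X and T are placeholders standing in for one video's actual feature and target sequences.

% Minimal sketch: train an open-loop NARX on one video, assuming X and T
% are 1-by-N cell arrays (one cell per frame/timestep) for that video.
net = narxnet(1:2, 1:2, 10);                 % delays and hidden size are placeholders
[Xs, Xi, Ai, Ts] = preparets(net, X, {}, T); % shift data for the tapped delay lines
net = train(net, Xs, Ts, Xi, Ai);            % train on this single sequence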

Accepted Answer

Greg Heath 2017-9-16
You cannot train in batches of different data: the weight updates from the last batch will supersede all previous updates.
Therefore, you have to include representative old data together with the new data.
Typically, the old data's endpoints are more important than its central points.
Hope this helps.
Thank you for formally accepting my answer
Greg
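A minimal sketch of this idea, assuming the previously trained net and hypothetical cell arrays Xkeep/Tkeep that hold a few retained sequences from earlier videos, with Xnew/Tnew holding the current video; catsamples with NaN padding handles the differing sequence lengths (see the padding note in the answer below).

% Combine retained old sequences with the new video as parallel samples,
% padding shorter sequences with NaN so they all have the same length.
Xmix = catsamples(Xkeep{:}, Xnew, 'pad', NaN);
Tmix = catsamples(Tkeep{:}, Tnew, 'pad', NaN);

[Xs, Xi, Ai, Ts] = preparets(net, Xmix, {}, Tmix);
net = train(net, Xs, Ts, Xi, Ai);   % train continues from the current weights

% Optionally add the new video to the retained pool for the next round.
Xkeep{end+1} = Xnew;
Tkeep{end+1} = Tnew;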

More Answers (1)

Sarah Mohamed 2017-9-15
Assuming that a function such as "adapt" is being used to incrementally train the network, it may still be possible for the neural network to "forget" earlier training inputs even when the weights are carried over from one training iteration to the next. You might find the following discussion helpful:
Padding the shorter sequences and combining the data may be an alternative approach. The following documentation notes that "...it is required that each sequence be of the same length. If this is not the case, then the shorter sequence inputs and targets should be padded with NaNs, in order to make all sequences the same length. The targets that are assigned values of NaN will be ignored during the calculation of network performance."
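A hedged sketch of that alternative, assuming hypothetical cell arrays Xvideos and Tvideos in which each element is one video's 1-by-N cell-array sequence (N may differ per video):

% Pad every sequence to the length of the longest one with NaN; padded
% target entries are ignored when the network performance is computed.
Xall = catsamples(Xvideos{:}, 'pad', NaN);
Tall = catsamples(Tvideos{:}, 'pad', NaN);

net = narxnet(1:2, 1:2, 10);                  % delays and hidden size are placeholders
[Xs, Xi, Ai, Ts] = preparets(net, Xall, {}, Tall);
net = train(net, Xs, Ts, Xi, Ai);             % one training run over all videos at once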
