Can I tell trainNetwork() to return the net with the lowest validation or training loss, as opposed to the net at the last iteration?

8 views (last 30 days)
Sometimes training diverges and it's wasteful to re-run with a specific stopping epoch. I could use checkpoints and sift through the trainingInfo to find the lowest loss, but that carries costs in memory and time. Also I am using the Experiment Manager and I'd have to add special code to load the correct checkpoint in my custom metric function. Having a training option to return the net that minimizes the loss would allow me to easily compare multiple experiments, where divergence may occur at different training epochs. Also, minimizing the loss is the definition of training, so it just makes sense.
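For reference, the checkpoint workaround mentioned above looks roughly like the following. This is a minimal sketch with assumed data variables (`trainData`, `valData`, `layers`); checkpoint files are saved as `net_checkpoint__*.mat` and contain a variable named `net`:

```matlab
% Save a checkpoint each epoch, then pick the one nearest the
% iteration with the lowest validation loss.
opts = trainingOptions("adam", ...
    "CheckpointPath", "checkpoints", ...    % one .mat file per epoch
    "ValidationData", valData);
[trainedNet, info] = trainNetwork(trainData, layers, opts);

% ValidationLoss is NaN on iterations where no validation ran.
valLoss = info.ValidationLoss;
valLoss(isnan(valLoss)) = Inf;
[~, bestIter] = min(valLoss);

% Checkpoint filenames embed the iteration number; match the file
% whose iteration is closest to bestIter (parsing omitted here).
files = dir(fullfile("checkpoints", "net_checkpoint__*.mat"));
% ... select the file corresponding to bestIter ...
best = load(fullfile(files(end).folder, files(end).name), "net");
bestNet = best.net;
```

As noted in the comment below, this approach fails for networks with batch normalization layers, which is part of the motivation for the request.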
  1 Comment
Nethtrick 2020-9-22
A similar question:
I have implemented the checkpoint approach, but am also running into the same issue with batch normalization layers not being defined. So the checkpoint approach is NOT a workaround when your network has batch normalization layers.


Answers (1)

Madhav Thakker 2020-9-25
Hi Nethtrick,
As of now, trainNetwork does not have an option to return the network with the least validation error; it always returns the network from the final iteration.
However, you might want to explore Custom Training Loops -
  1. Custom Training Loops with dlnetwork (more flexible than trainNetwork/DAGNetwork)
  2. Custom Training Loop with Model as Function (Most flexible)
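With a custom training loop, keeping the best-so-far network is a few extra lines. A minimal sketch, assuming a `modelLoss` helper (not shown) that computes the loss and its gradients via dlgradient, and assumed variables `layers`, `X`, `T`, `XVal`, `TVal`:

```matlab
% Custom training loop with dlnetwork that snapshots the parameters
% achieving the lowest validation loss seen so far.
net = dlnetwork(layerGraph(layers));
bestLoss = Inf;
trailingAvg = []; trailingAvgSq = []; iteration = 0;

for epoch = 1:numEpochs
    % ... loop over mini-batches; one step shown for brevity ...
    iteration = iteration + 1;
    [loss, gradients] = dlfeval(@modelLoss, net, X, T);
    [net, trailingAvg, trailingAvgSq] = adamupdate(net, gradients, ...
        trailingAvg, trailingAvgSq, iteration);

    % After each epoch, evaluate on validation data and keep the best net.
    valLoss = dlfeval(@modelLoss, net, XVal, TVal);
    if extractdata(valLoss) < bestLoss
        bestLoss = extractdata(valLoss);
        bestNet  = net;   % best-so-far snapshot, returned after training
    end
end
```

Because `net` is an ordinary variable here, divergence late in training simply leaves `bestNet` untouched, which addresses the original request.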
Examples (R2020a) of these features are available in the documentation. In order to use these more flexible tools, it is suggested to upgrade to the newest release.
You can also have a look at https://www.mathworks.com/help/deeplearning/ug/customize-output-during-deep-learning-training.html, which has an example of defining an output function that stops training if the best classification accuracy on the validation data does not improve for N validations in a row.
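An output function along the lines of that documentation example can be sketched as follows (the function name and threshold `N` are illustrative):

```matlab
% Output function: stop training when validation accuracy has not
% improved for N consecutive validations. Used via:
%   trainingOptions(..., "OutputFcn", @(info) stopIfAccuracyNotImproving(info, 3))
function stop = stopIfAccuracyNotImproving(info, N)
persistent bestValAccuracy valLag
stop = false;
if info.State == "start"
    bestValAccuracy = 0;
    valLag = 0;
elseif ~isempty(info.ValidationAccuracy) && ~isnan(info.ValidationAccuracy)
    if info.ValidationAccuracy > bestValAccuracy
        bestValAccuracy = info.ValidationAccuracy;  % new best: reset counter
        valLag = 0;
    else
        valLag = valLag + 1;                        % no improvement
        stop = valLag >= N;
    end
end
end
```

Note this only stops training early; the returned network is still the one from the iteration at which training stopped, not necessarily the best one.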
Hope this helps.
