When should I stop training a neural network?

I'm working with a backpropagation neural network. The network has 6 inputs, 1 hidden layer (6 neurons in that layer) and 1 output. I train the network with the "Levenberg-Marquardt" and "Bayesian Regularization" algorithms. The idea is to "predict" a result, but the results are not the right ones according to the table of historical data.
To decide when to stop training, I currently look at the regression plot, the mean squared error and the regression R values, which show the "ideal values", yet the results are still not accurate and are not even close for inputs that don't exist in the table of historical data.
Which plot should I look at to know whether the network is overfitting or is correctly trained?

Accepted Answer

Greg Heath on 26 Jan 2019
Edited: Greg Heath on 26 Jan 2019
The danger is OVERTRAINING an OVERFIT NET. There are several approaches.
1. PREVENT OVERFITTING the I-H-O net by keeping the number of training equations
Ntrneq = Ntrn*O
no smaller than the number of unknown weights
Nw = (I+1)*H + (H+1)*O
i.e., Ntrneq >= Nw
2. PREVENT OVERTRAINING by using a reasonable training goal
mse(target-output) <= 0.01 * mse(target - mean(target')')
(see the sketch after this list for 1 and 2)
3. PREVENT OVERTRAINING by using a validation subset to implement EARLY STOPPING
4. PREVENT OVERTRAINING by using TRAINBR to implement BAYESIAN REGULARIZATION LEARNING
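A minimal MATLAB sketch of checks 1 and 2, assuming the questioner's I = 6, H = 6, O = 1 and hypothetical training matrices xtrn (I x Ntrn) and ttrn (O x Ntrn); fitnet and trainlm are from the Deep Learning Toolbox, and the variable names are my own:

```matlab
% Sketch only: Ntrneq vs Nw check and 1%-of-variance training goal
I = 6; H = 6; O = 1;
Ntrn = size(xtrn,2);                 % hypothetical training inputs, I x Ntrn

Ntrneq = Ntrn*O;                     % number of training equations
Nw     = (I+1)*H + (H+1)*O;          % number of unknown weights
Hub    = floor((Ntrneq-O)/(I+O+1));  % largest H that keeps Ntrneq >= Nw

if Ntrneq < Nw
    warning('Overfit net: reduce H to at most %d or add training data.', Hub)
end

% Item 2: aim for 1% of the MSE obtained by a constant mean(target) output
MSE00 = mean(var(ttrn',1));          % reference ("naive") MSE
net   = fitnet(H,'trainlm');
net.trainParam.goal = 0.01*MSE00;
```

If the warning fires, either lower H toward Hub or collect more training data before worrying about the stopping criterion.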
EARLY STOPPING (3) is automatic with the default TRAINLM.
Typically I try to implement 1-3 by minimizing H in addition to using EARLY STOPPING with the default training algorithm TRAINLM. On rare occasions I will use TRAINBR (4) when 1-3 do not yield satisfactory results; a sketch of both follows below.
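A minimal sketch of items 3 and 4, assuming the full data sits in x (6 x N) and t (1 x N); the 70/15/15 split ratios and max_fail value shown are simply the toolbox defaults, not settings prescribed in the answer:

```matlab
% Item 3: trainlm with a validation subset for early stopping
net = fitnet(H,'trainlm');
net.divideFcn = 'dividerand';            % random train/val/test split
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
net.trainParam.max_fail    = 6;          % stop after 6 rising validation errors
[net, tr] = train(net, x, t);
plotperform(tr)                          % validation curve turning up => overtraining

% Item 4: Bayesian regularization; trainbr does not use a validation stop,
% so all data can go to training
netbr = fitnet(H,'trainbr');
netbr.divideFcn = 'dividetrain';
[netbr, trbr] = train(netbr, x, t);
```

The performance plot from plotperform is the graphic the questioner is after: training error that keeps falling while validation error rises is the signature of overtraining.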
Searching BOTH NEWSGROUP and ANSWERS using
Greg Ntrneq Nw
should yield zillions of examples
Hope this helps
THANK YOU FOR FORMALLY ACCEPTING MY ANSWER
GREG
