- A small correction: the gradient itself (not "gradient descent", which is the algorithm) is the vector quantity, and it points in the direction of maximum change of the cost function.
- 'net.trainParam.min_grad' is a scalar (numeric) quantity. The parameter 'min_grad' is the minimum magnitude (a scalar, i.e. the norm) of the gradient vector at which training of the neural network terminates.
- When the magnitude of the gradient falls below 'min_grad', the neural network model is considered optimized, and further training stops.
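The stopping rule described in the bullets above can be sketched in plain MATLAB. This is only an illustration: the quadratic cost, the learning rate, and the value of min_grad here are assumptions for the sketch, not Toolbox defaults.

```matlab
% Minimal sketch of the min_grad stopping criterion in gradient descent.
% Cost f(w) = w'*w is an illustrative assumption; its gradient is 2*w.
w = [3; -2];              % initial weights
lr = 0.1;                 % learning rate (assumed for this sketch)
min_grad = 1e-5;          % stop once the gradient NORM falls below this scalar
for epoch = 1:10000
    g = 2*w;              % the gradient: a VECTOR
    if norm(g) < min_grad % its magnitude: a SCALAR, compared against min_grad
        break;            % training terminates, as with net.trainParam.min_grad
    end
    w = w - lr*g;         % gradient descent update
end
```

The point of the sketch is the comparison line: the vector gradient is reduced to a scalar via its norm, which is what makes a scalar threshold like 'min_grad' well defined.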
What is the parameter minimum performance gradient (trainParam.min_grad) of traingd?
I use the training function "traingd" to train a shallow neural network:
trainedNet = train(net,X,T)
For the training function "traingd": How is the parameter minimum performance gradient (net.trainParam.min_grad) defined?
As the gradient for the gradient descent is usually a vector, but net.trainParam.min_grad is a scalar value, I am confused.
Is it the change in the performance (loss) between two iterations, and if so: does it refer to the training, validation, or testing error?
Thanks in advance!
I use MATLAB 2013 and 2015 with the Neural Network Toolbox.
Accepted Answer
Rishabh Mishra
2020-9-28
Edited: Rishabh Mishra
2020-9-28
Hi,
Based on your description of the issue, I would state a few points (listed at the top of this answer):
For a better understanding, refer to the following links:
Hope this helps.
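To tie this back to the original question, 'min_grad' can be set on the network before calling train. The network size, threshold value, and data below are illustrative assumptions; 'feedforwardnet', 'trainParam.min_grad', and the training record 'tr' are standard Neural Network Toolbox names.

```matlab
% Illustrative setup (data, layer size, and threshold are assumptions):
X = rand(2, 100);                     % inputs
T = sin(X(1,:)) + X(2,:);             % targets
net = feedforwardnet(10, 'traingd');  % shallow network trained with traingd
net.trainParam.min_grad = 1e-6;       % stop once the gradient norm < 1e-6
net.trainParam.epochs = 500;          % upper bound on epochs
[trainedNet, tr] = train(net, X, T);
% tr.stop records why training ended, e.g. that the minimum gradient
% was reached rather than the epoch limit.
```

Inspecting 'tr.stop' after training is a quick way to confirm whether 'min_grad', the epoch limit, or another criterion terminated training.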