Determining learning rate and generalization rate in Deep Learning Toolbox
Hello to all,
I have a given neural network, and I want to see how the learning rate and the generalization rate change as I vary the architecture of the net, such as the activation function, the number of layers, and the training algorithm (GD vs. conjugate GD).
I am using the documentation: https://www.mathworks.com/help/deeplearning/examples/create-simple-deep-learning-network-for-classification.html
*The only thing I can extract is the accuracy. I don't understand why the example only lets us force the learning rate to be 0.1 — how can it be changed?
*Moreover, I want the activation function to be sigmoid, and the only close possibility is tanh. How can I add sigmoid as the activation function?
*In addition, I see only the training in the documentation. How do I add the test samples and check the generalization efficiency?
*And a last question: the only algorithm that seems relevant is SGDM (stochastic gradient descent with momentum), whose advantage is that we can choose our mini-batch, but I don't see an SCGD (stochastic conjugate gradient descent) option. How do I add it as my algorithm when I train the net?
To see the options that the Deep Learning Toolbox offers, I used the documentation:
Answer (1)
Mahesh Taparia
2020-1-7
Hi Daniel,
You can change the learning rate by setting the 'InitialLearnRate' name-value parameter of trainingOptions (the example uses 'InitialLearnRate',0.01).
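A minimal sketch of what that looks like (the other option values shown here are placeholders, not values from the example):

```matlab
% Sketch: overriding the initial learning rate via trainingOptions.
% 'InitialLearnRate' is the parameter to vary; 0.001 here is an
% arbitrary illustrative value.
options = trainingOptions('sgdm', ...
    'InitialLearnRate', 0.001, ...
    'MaxEpochs', 4, ...
    'Verbose', false);
% Then train as in the example:
% net = trainNetwork(imdsTrain, layers, options);
```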
You can evaluate the performance of the model on the test data using the classify command. You can refer to its documentation for details.
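As a hedged sketch (assuming `net` is your trained network and `imdsTest` is an imageDatastore of held-out test images with labels), test accuracy can be computed like this:

```matlab
% Sketch: measuring generalization on a held-out test set.
% imdsTest is assumed to be an imageDatastore with a Labels property.
YPred = classify(net, imdsTest);          % predicted class labels
YTest = imdsTest.Labels;                  % ground-truth labels
testAccuracy = sum(YPred == YTest) / numel(YTest);
```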
As of now, there are three optimization algorithms implemented, the ones used most frequently: 'sgdm', 'adam', and 'rmsprop'.
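The solver is selected by the first argument of trainingOptions; a sketch of the three choices (the extra name-value pairs are illustrative, not required):

```matlab
% Sketch: the first argument of trainingOptions picks the solver.
optsSgdm = trainingOptions('sgdm', 'Momentum', 0.9);  % SGD with momentum
optsAdam = trainingOptions('adam');                   % Adam
optsRms  = trainingOptions('rmsprop');                % RMSProp
```

Stochastic conjugate gradient descent is not among these built-in solvers, so switching solvers means choosing one of the three names above.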