Effective number of parameters in a neural network
Hello,
I'm training a neural network using the Bayesian approach. In the documentation, I read the following: "One feature of this algorithm is that it provides a measure of how many network parameters (weights and biases) are being effectively used by the network."
But I don't quite understand something: once I know the effective number of parameters, what can I do with this information? For starters, how come some of the parameters are not used? Why are some weights inactive? Secondly, can knowing that help me prune the network and reduce the number of neurons, for example? If yes, how? If no, then what is the practical use of that piece of information?
Thanks in advance for your help!
J
Accepted Answer
Greg Heath
2013-5-19
TRAINBR automatically chooses the weighting ratio applied to the sum of squared weights, which is added to the sum of squared errors to form the objective function. The choice depends on the effective number of weights.
I don't recall the formula, but you should be able to find it in the source code, its references, or online.
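If I remember the toolbox references (MacKay 1992; Foresee & Hagan 1997) correctly, the effective number of parameters is gamma = N - 2*alpha*trace(inv(H)), where N is the total number of weights and biases, alpha is the weight-decay coefficient, and H is the Hessian of the regularized objective. As a minimal sketch of how to read the value TRAINBR reports, assuming your toolbox version stores it in a training-record field named gamk (check tr to confirm):
[x, t] = simplefit_dataset;          % small regression data set shipped with the toolbox
net = fitnet(10, 'trainbr');         % 10 hidden neurons, Bayesian regularization
net.trainParam.showWindow = false;
[net, tr] = train(net, x, t);
totalParams = numel(getwb(net));     % total number of weights and biases
effParams = tr.gamk(end);            % effective number of parameters (assumed field name)
fprintf('Effectively using %.1f of %d parameters.\n', effParams, totalParams)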
The only way I can see you using it is if you use TRAINLM with the regularization option of mse. In that case the user chooses the ratio. However, I don't know of a good reason to do that instead of using TRAINBR.
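For completeness, a minimal sketch of that TRAINLM alternative, assuming the current API exposes the ratio through net.performParam.regularization (a value between 0 and 1 that weights the mean-squared-weights term against the mean squared error):
[x, t] = simplefit_dataset;
net = fitnet(10, 'trainlm');
net.performFcn = 'mse';
net.performParam.regularization = 0.1;   % user-chosen ratio (assumed property name)
net.trainParam.showWindow = false;
net = train(net, x, t);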
Hope this helps.
Thank you for formally accepting my answer.
Greg