Why is Bayesian regularization backpropagation (Neural Network Toolbox) so very slow?
Empirically, I've found with a challenging pattern recognition problem I'm working on that Bayesian regularization backpropagation (trainbr) outperforms more standard algorithms such as trainlm, trainscg and trainrp by quite a bit. But it takes far longer to compute. In its original formulation (MacKay 1992), Bayesian regularization required calculation of the Hessian matrix, which is very computationally demanding and would account for the long run time. However, Foresee and Hagan (1997) developed an alternative that claims to reduce the computational cost to something comparable to trainlm (both works are cited in the MATLAB documentation for trainbr). That later work is cited in the documentation, but is it actually implemented? Can I find a library somewhere that implements it? I'm fairly confident that trainbr as implemented in the Neural Network Toolbox requires calculation of the Hessian, because it refuses to run on a GPU, citing lack of support for inversion of the (related) Jacobian as the reason. But I'd be happy to be educated on that.
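For concreteness, here is the kind of timing comparison I mean, as a minimal sketch (cancer_dataset, the 10-neuron hidden layer and the epoch cap are stand-ins for my actual setup):

% Rough timing sketch: train the same shallow network with different
% training functions and compare wall-clock time. cancer_dataset is a
% sample data set that ships with the toolbox.
[x, t] = cancer_dataset;

for fcn = {'trainscg', 'trainlm', 'trainbr'}
    net = feedforwardnet(10, fcn{1});   % one hidden layer, mse performance
    net.trainParam.showWindow = false;  % suppress the training GUI
    net.trainParam.epochs = 200;        % cap epochs so every run terminates
    tic
    net = train(net, x, t);
    fprintf('%s: %.1f s\n', fcn{1}, toc);
end

On my problem the trainbr run dwarfs the others, which is what prompted the question.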
2 Comments
Andrew Diamond
2018-1-24
Did you ever ping MATLAB support on this? As they say, "Inquiring minds want to know."
Answers (2)
Mustafa Sobhy
2019-8-22
Because it requires the computation of the Hessian matrix of the performance index.
Source: Foresee and Hagan, "Gauss-Newton Approximation to Bayesian Learning" (1997).
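Sketching the Foresee and Hagan formulation (notation paraphrased from the paper, so check it against the original): the regularized objective is

    F(w) = \beta E_D + \alpha E_W,

and the Gauss-Newton step replaces the full Hessian with

    H = \nabla^2 F(w) \approx 2\beta J^T J + 2\alpha I,

where J is the Jacobian of the training-set errors with respect to the weights. Re-estimating the regularization parameters uses the effective number of parameters

    \gamma = N - 2\alpha \, \mathrm{tr}(H^{-1}),

with updates \alpha = \gamma / (2 E_W(w)) and \beta = (n - \gamma) / (2 E_D(w)), where N is the number of weights and biases and n the number of error terms. So even the Gauss-Newton shortcut forms and inverts an N-by-N matrix built from J at every re-estimation step, which is consistent with the GPU error about Jacobian inversion mentioned in the question.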
0 Comments