Weight decay parameter and Jacobian matrix of a neural network
I want to calculate prediction intervals, so I have two direct questions:
- How can I get the weight decay parameter 'alpha' (in the performance function mse + alpha*msw) when 'trainbr' is used as the training algorithm?
- How can I get the network's Jacobian matrix (derivatives of the errors with respect to the weights) calculated during training?
Accepted Answer
Greg Heath
2014-2-19
Edited: Greg Heath
2014-2-19
The documentation for trainbr is pretty bad.
help trainbr
doc trainbr
Look at the source code
type trainbr
I am not familiar with it but will take a look when I get time.
Meanwhile, if you make a run, the training record tr contains two parameters
gamk: [1x31 double]
ssX: [1x31 double]
that are involved.
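A minimal sketch of pulling those fields out of the training record after a trainbr run; the dataset (simplefit_dataset), the network size, and the use of defaultderiv for the Jacobian part of the question are assumptions on my part, not something stated above:
% Sketch only: assumed toy data and network size; adapt to your own problem
[x, t] = simplefit_dataset;          % toy curve-fitting data shipped with the toolbox
net = fitnet(10, 'trainbr');         % one hidden layer, Bayesian-regularization training
[net, tr] = train(net, x, t);
% The fields mentioned above, one value per epoch:
gamk = tr.gamk;   % gamma, the effective number of parameters (as I understand it)
ssX  = tr.ssX;    % appears to be the sum-squared-parameter term; confirm via "type trainbr"
% For the Jacobian question, defaultderiv can return derivatives of the
% errors with respect to the weights and biases -- an assumption about the
% intended use here; check "doc defaultderiv" in your release:
de_dwb = defaultderiv('de_dwb', net, x, t);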
Hope this helps.
Thank you for formally accepting my answer
Greg
More Answers (1)
Platon
2014-2-21
1 Comment
Greg Heath
2014-2-21
When using the obsolete msereg, or mse with the regularization option, the weighting parameters are alpha (the specified error weight) and (1-alpha).
However, when using trainbr, the weighting parameters alpha and beta are recalculated each epoch. I haven't deciphered the logic yet. It might be faster to search the web.
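For reference, trainbr is based on the MacKay / Foresee-Hagan Bayesian-regularization scheme, where the hyperparameters are typically re-estimated each epoch as alpha = gamma/(2*E_W) and beta = (N - gamma)/(2*E_D), with gamma the effective number of parameters, E_W the sum of squared weights, E_D the sum of squared errors, and N the number of error terms; whether trainbr uses exactly these updates should be confirmed against its source (type trainbr). Below is a sketch of the fixed-ratio alternative mentioned above (plain mse with the regularization option); the data, network size, and ratio value are arbitrary examples:
% Sketch: fixed-ratio regularization with mse, instead of trainbr's adaptive alpha/beta
[x, t] = simplefit_dataset;
net = fitnet(10, 'trainlm');
net.performFcn = 'mse';
net.performParam.regularization = 0.2;   % ratio mixing the error and weight terms;
                                         % see "doc mse" for the exact weighting used
net = train(net, x, t);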