Freeze specific weights in a custom neural network
7 views (last 30 days)
Hi, I've made a custom neural network with 69 layers. I have 3 inputs, and the first input is either 1 or -1. What I need is for the connections from this input to the different layers to be scaled by a constant weight, so that the NN trains only the other weights. Thank you for your help! This is my first time asking a community on the internet :)
3 comments
Answers (1)
Sara Perez
2019-9-12
You can set the layer property 'WeightLearnRateFactor' to zero, so those weights won't be updated during training.
more info here:
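A minimal sketch of how this could look with MATLAB's Deep Learning Toolbox. The layer sizes and names here are illustrative, not from the original question; the idea is to assign the desired constant weights and then zero the learn-rate factor so training leaves them untouched:

```matlab
% Sketch: freeze one layer's weights so training does not change them.
% The architecture below is a placeholder, not the asker's 69-layer network.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 8, 'Name', 'conv_frozen')
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

% Fix the connection at a constant value by assigning the weights up front
% (size: filterHeight x filterWidth x numChannels x numFilters).
layers(2).Weights = 0.5 * ones(3, 3, 1, 8);   % 0.5 is an example constant

% Zero the learn-rate factor so these weights are never updated.
layers(2).WeightLearnRateFactor = 0;
layers(2).BiasLearnRateFactor   = 0;          % also freeze the bias, if desired

% In layerGraph/dlnetwork workflows, setLearnRateFactor does the same:
% frozen = setLearnRateFactor(layers(2), 'Weights', 0);
```

The effective learning rate for a parameter is the global learning rate multiplied by its learn-rate factor, so a factor of 0 keeps the preset constant weights fixed while the rest of the network trains normally.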
0 comments