Freeze specific weights in a custom neural network

13 views (last 30 days)
Hi, I've made a custom neural network with 69 layers and 3 inputs; the first input is either 1 or -1. What I need is for the connections from this input to the different layers to be scaled by a constant weight, so that training acts only on the other weights. Thank you for your help! This is my first time asking a community on the internet :)
3 Comments
tchedou menou 2016-11-13
Edited: tchedou menou 2016-11-13
Hi, thank you for your answer!
I'm pretty sure my neural network would behave in a "nice way" when I do this. Let me explain:
I need several layers to be shut down (output zero) when my input is -1 and turned on when it is 1. Using ReLU as the activation function, this is actually possible.
Example: take a single neuron with 2 inputs, where the second input times its weight is 2. When the first input is 1 and its weight is 3 (this is the weight I want to freeze), the pre-activation is 1*3 + 2 = 5, so the neuron outputs 5. When the first input is -1, the pre-activation is -1*3 + 2 = -1, so the neuron outputs 0 and is dead (because of the ReLU).
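In MATLAB, that arithmetic looks like this (a minimal sketch; the values are the ones from the example above):
% One neuron, two inputs, ReLU activation.
relu = @(x) max(0, x);           % ReLU
w1   = 3;                        % the frozen weight on the gating (first) input
x2w2 = 2;                        % second input times its weight
y_on  = relu( 1*w1 + x2w2)       %  1*3 + 2 =  5 -> neuron active, outputs 5
y_off = relu(-1*w1 + x2w2)       % -1*3 + 2 = -1 -> ReLU outputs 0, neuron dead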
If you have any idea how to tell a neuron to output zero based on a condition, I'm all ears. For the moment I don't know any other way to do this.
Thank you :)
tchedou menou 2016-11-17
Sorry, my last comment was confusing.
In simpler words: I want to prevent ReLU layers from dying. There are many solutions to this (using modified versions of ReLU, or a slowly converging training function). My idea is to fix certain weights (or at least restrict them to a range) so that the chance of a layer always outputting 0 is very limited; a rough sketch follows.
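As a rough sketch of that "range of freedom" idea (assuming the classic network object, a hypothetical layer index k, and a hypothetical allowed range [wMin, wMax]), one could clamp the gating weight after each training round:
% Rough sketch: keep an input weight of layer k inside a chosen range.
% 'net', 'X', 'T' and 'k' are placeholders for your network and data.
wMin = 2; wMax = 4;                                  % hypothetical range
for epoch = 1:10
    net = train(net, X, T);                          % one round of training
    net.IW{k,1} = min(max(net.IW{k,1}, wMin), wMax); % clamp into [wMin, wMax]
end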
thanks

Answers (1)

Sara Perez 2019-9-12
You can set the layer property 'WeightLearnRateFactor' to zero, so the weights won't be modified during training.
More info here:
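For example (a minimal sketch using Deep Learning Toolbox layers; the sizes are placeholders for your network):
% A fully connected layer whose weights are excluded from training:
frozen = fullyConnectedLayer(10, 'Name', 'frozen_fc');
frozen.WeightLearnRateFactor = 0;    % global learning rate * 0 -> weights never update
frozen.BiasLearnRateFactor   = 0;    % optionally freeze the bias as well
frozen.Weights = 3 * ones(10, 3);    % set the constant weight values yourself
frozen.Bias    = zeros(10, 1);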
