- https://www.mathworks.com/help/deeplearning/ug/define-custom-deep-learning-layer.html
- https://www.mathworks.com/help/deeplearning/ug/define-custom-deep-learning-intermediate-layers.html
How can I write my own piecewise defined custom activation function that takes dlarray as input?
I'm trying to write my own activation function to prevent my gradients from vanishing.
However, I struggle with making the implementation suitable for dlarrays.
When using a "normal" numeric array, my code looks like the following and works just fine:
f = tanh(x);
f(abs(tanh(x)) >= 0.99) = x(abs(tanh(x)) >= 0.99) * (0.99 / atanh(0.99));
When I use this function in the predict function of my custom layer, I get a lot of errors regarding "reshaping" when trying to run the network.
I tried to write my custom layer according to https://de.mathworks.com/help/deeplearning/ug/define-custom-deep-learning-layer.html
I'm grateful for any help provided!
Answers (1)
Kausthub
2023-9-7
Hi Carina Haschke,
I understand that you are facing problems while creating a piecewise-defined custom activation function that has an input of type ‘dlarray’. Since you are getting a lot of errors regarding ‘reshaping’ please crosscheck the input and output sizes of the layers. I believe that changing the array to ‘dlarray’ will not introduce any new errors.
I have attached the requested custom activation function along with a sample neural network to test the layer. The example works for both a "normal" numeric array and a dlarray. Please refer to the snippets for clarification and adapt them to your needs.
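The attachment itself is not reproduced here, but a minimal sketch of such a layer might look like the following. The class name `saturatingTanhLayer` and the mask-based formulation are illustrative assumptions: element-wise multiplication with a mask replaces logical indexing, which is a common workaround when indexed assignment causes errors during dlarray tracing.

```matlab
% Illustrative sketch of a piecewise activation layer; the class name
% and mask-based blend are assumptions, not the attached file.
classdef saturatingTanhLayer < nnet.layer.Layer
    methods
        function layer = saturatingTanhLayer(name)
            layer.Name = name;
            layer.Description = "Piecewise tanh with linear extension";
        end

        function Z = predict(~, X)
            % Avoid logical indexed assignment on dlarray inputs by
            % blending the two branches with an element-wise mask.
            T = tanh(X);
            mask = abs(T) >= 0.99;                  % saturated region
            linearPart = X * (0.99 / atanh(0.99));  % scaled linear branch
            Z = T .* ~mask + linearPart .* mask;    % element-wise blend
        end
    end
end
```

Because predict contains only element-wise operations, the same code runs unchanged on numeric arrays and dlarrays, and automatic differentiation can trace through it.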
You could also refer to these articles for more information:
Hope this helps and clarifies your queries regarding creating custom activation functions with a dlarray as input!