Adding dlarray support / differentiability to custom functions
I am training deep neural networks with custom training loops, defining the model/network as a function passed to dlfeval(). The model uses functions that do not currently have dlarray support (for example, matrix determinant and inverse), so I cannot backpropagate through them to compute gradients for optimization.
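For concreteness, here is the shape of the failure (a minimal sketch with made-up names; det stands in for any unsupported function, and newer releases may already support it):

    function [loss, grad] = objective(A)
        loss = det(A);              % errors here: det does not accept dlarray
        grad = dlgradient(loss, A); % never reached
    end

    A = dlarray(randn(3));
    [loss, grad] = dlfeval(@objective, A);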
The only way I have found to add support for these functions is to wrap them in a custom layer (nnet.layer.Layer) with a custom backward() function. However, I can only use this custom layer after embedding it in a layerGraph and dlnetwork, together with input and output layers, which seems very cumbersome. Is there a more straightforward option?
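Here is roughly what that wrapper looks like for det (an untested sketch; the layer name is mine, and the gradient comes from Jacobi's formula, d det(X)/dX = det(X)*inv(X)'):

    classdef detLayer < nnet.layer.Layer
        methods
            function layer = detLayer(name)
                layer.Name = name;
            end

            function Z = predict(layer, X)
                % Plain numeric forward pass; tracing is not needed
                % because backward() is supplied explicitly.
                Z = det(extractdata(X));
            end

            function dLdX = backward(layer, X, Z, dLdZ, memory)
                % Jacobi's formula; assumes a single square matrix input
                Xv = extractdata(X);
                dLdX = extractdata(dLdZ) .* det(Xv) .* inv(Xv).';
            end
        end
    end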
This may be a feature request: please let us define new functions with dlarray support. The official list (https://www.mathworks.com/help/deeplearning/ug/list-of-functions-with-dlarray-support.html) is growing slowly, but the community could expand it very quickly.
0 Comments
Answers (1)
Shivansh
2023-10-23
Hi Damien,
I understand that your deep neural network model uses some functions that are not currently supported by dlarray.
I tried to reproduce this on my end, and wrapping the functions in a custom layer with a custom backward() function looks like the only possible way. It seems like a valid feature request.
I expect MathWorks is aware of the issue and will add support for user-defined dlarray functions in future releases.
Hope it helps!
1 Comment
Miguel Rivas Costa
2024-1-24
Hi, maybe a bit late, but I would like to know how to wrap functions that are not currently supported by dlarray in a custom layer. Does that custom layer have to be at the end of the network?
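From the documentation, my guess is something like the following, with the wrapper layer placed wherever the unsupported function is needed, not necessarily at the end (an untested sketch reusing the detLayer above; sizes and names are placeholders):

    n = 3;
    lgraph = layerGraph([
        imageInputLayer([n n 1], 'Normalization', 'none', 'Name', 'in')
        detLayer('det')]);
    net = dlnetwork(lgraph);

    % Inside the model function evaluated with dlfeval:
    X = dlarray(randn(n, n, 1, 1), 'SSCB');   % one n-by-n observation
    Z = forward(net, X);    % gradients flow through the custom backward()

So I would guess the layer does not have to be at the end of the network, but I have not verified this.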