Is there a way to extract partial derivatives of specific layers in deep learning toolbox?

Hi,
I asked this question last year, asking whether it is possible to extract the partial derivatives involved in backpropagation for the parameters of a layer, so that I can use them for other purposes. At that time the latest MATLAB version was R2019b, and I was told in that post that this is only possible when the final output y is a scalar, while my desired y can be a vector, matrix, or even a tensor (e.g. in reconstruction tasks).
Now, is it possible to extract the partial derivatives of a layer in R2020b? Thanks.

Answers (1)

Amanpreetsingh Arora
As mentioned in the answer to the question you referred to, the only way to find partial derivatives of a tensor is to loop over its elements and call "dlgradient", because "dlgradient" only supports a scalar input for automatic differentiation. However, I understand your concern that this wastes time recomputing overlapping traces. To improve performance, you can call "dlgradient" with the 'RetainData' parameter set to true, which retains the intermediate gradient computations so that any subsequent calls to the function reuse them. Refer to the dlgradient documentation for more information.
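For illustration, here is a minimal sketch of this element-by-element approach (the network "net", the dlarray format, and the helper name are assumptions, not code from this thread). Each output element is differentiated separately with respect to the learnable parameters, and 'RetainData' set to true lets later calls reuse the trace recorded by earlier ones within the same dlfeval evaluation.

function gradTables = elementwiseGradients(net, dlX)
    % Forward pass through a dlnetwork; dlY may be a vector, matrix, or tensor
    dlY = forward(net, dlX);
    numOut = numel(dlY);
    gradTables = cell(numOut, 1);
    for k = 1:numOut
        % dlgradient requires a scalar first argument, so differentiate one
        % output element at a time. 'RetainData' keeps the intermediate
        % computations so subsequent calls can reuse them.
        gradTables{k} = dlgradient(dlY(k), net.Learnables, 'RetainData', true);
    end
end

% Call inside dlfeval so automatic differentiation traces the computation:
% dlX = dlarray(X, 'SSCB');   % the data format here is an assumption
% gradTables = dlfeval(@elementwiseGradients, net, dlX);

Each entry of gradTables is then a table of partial derivatives of one output element with respect to every learnable parameter; selecting a specific layer's rows gives that layer's gradients.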
1 comment
SC 2020-11-21
Thanks. I have just submitted a new feature request. This is a useful feature for network surgery, and it is available in TensorFlow as well. I hope a more convenient partial-derivative extraction function will be available in MATLAB in the future. Since the values are already stored in the model during backward propagation, it shouldn't be difficult to let the user extract them, as no additional computation is required.

