Injecting noise into a CNN through a customized training loop

4 views (last 30 days)
Hi there.
I am using a customized loop to train my CNN. For my network design, I need to inject Gaussian noise into each layer. I could not find anything in the DL toolbox about a noise layer or L2 regularization. I need to know how I can put a Gaussian noise layer (if one exists) in my model, and where exactly its place would be in the layer ordering. Then, how can I define L2 regularization consistent with my customized training loop (with dlnetwork(lgraph))? I mean, for computing the loss function (using cross entropy) and the gradients (using dlfeval(@gradientmodel, ...)), should I just add 0.5*norm(dlnet.Learnables) to the loss and update dlnet.Learnables(i,:), where i refers only to the weights, or is there another approach?
Thanks for any help.

Accepted Answer

Shashank Gupta
Shashank Gupta 2021-1-7
Hi Mahsa,
There is no explicit layer for adding Gaussian noise to each layer in MATLAB, although you can create a custom one. Also check out this example; it discusses a custom Gaussian layer that you can take as a starting point. It will definitely help you.
All of the parameters in trainingOptions can also be implemented in a custom training loop, including L2 regularization. I suggest you follow up with this doc page; it gives a detailed explanation of how the different parameters can be implemented when using a custom training loop.
I hope this gives you a good head start.
Cheers.
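Along the lines the answer describes, a custom Gaussian noise layer could be sketched as below. This is only a sketch under my own assumptions: the class name, the Sigma property, and the placement are illustrative, not from the linked example. Noise is injected only during training (forward), while inference (predict) passes data through unchanged:

```matlab
classdef gaussianNoiseLayer < nnet.layer.Layer
    % Custom layer that adds zero-mean Gaussian noise during training.
    properties
        Sigma  % Standard deviation of the injected noise
    end
    methods
        function layer = gaussianNoiseLayer(sigma, name)
            layer.Name = name;
            layer.Sigma = sigma;
            layer.Description = "Gaussian noise, sigma = " + sigma;
        end
        function Z = predict(layer, X)
            % Inference: pass the input through unchanged.
            Z = X;
        end
        function Z = forward(layer, X)
            % Training: add noise drawn from N(0, Sigma^2).
            Z = X + layer.Sigma * randn(size(X), 'like', X);
        end
    end
end
```

A common placement (again, an assumption, not a rule from the toolbox) is after a convolution or its activation, e.g. convolution2dLayer(3,16), reluLayer, gaussianNoiseLayer(0.1,'noise1').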
1 Comment
MAHSA YOUSEFI
MAHSA YOUSEFI 2021-1-10
Edited: MAHSA YOUSEFI 2021-1-10
Thank you for your help. I am following your suggestions. Just one more thing about L2 regularization. On the page you linked, there is an update only for the gradients, not the loss. I have to first update the loss (adding the regularization term to the unregularized objective: loss(w) = loss(w) + l2Regularization/(2N) * ||w||^2) and then apply the gradient update as mentioned there. Am I right?
Also, on that page, "N" (the sample size used for computing the loss and gradient) was ignored. I think the update term for the gradient must be as follows:
gradients(idx,:) = dlupdate(@(g,w) g + (l2Regularization./N)*w, gradients(idx,:), dlnet.Learnables(idx,:));
not
gradients(idx,:) = dlupdate(@(g,w) g + l2Regularization*w, gradients(idx,:), dlnet.Learnables(idx,:));
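For what it's worth, one way to keep the loss and gradients consistent is to add the penalty to the loss inside the model gradients function before calling dlgradient, so the (l2Regularization/N)*w term falls out of automatic differentiation instead of needing a separate dlupdate step. A minimal sketch, assuming a classification loss and that only parameters named "Weights" are regularized (the function name and signature are my own):

```matlab
function [gradients, loss] = modelGradients(dlnet, dlX, Y, l2Regularization, N)
    dlYPred = forward(dlnet, dlX);
    loss = crossentropy(dlYPred, Y);

    % Sum of squared weights over every layer's "Weights" parameter.
    idx = dlnet.Learnables.Parameter == "Weights";
    weights = dlnet.Learnables.Value(idx);
    penalty = 0;
    for k = 1:numel(weights)
        penalty = penalty + sum(weights{k}.^2, 'all');
    end

    % loss(w) = loss(w) + l2Regularization/(2N) * ||w||^2
    loss = loss + l2Regularization/(2*N) * penalty;

    % Differentiating the penalized loss yields g + (l2Regularization/N)*w
    % for the weights automatically, matching the dlupdate form above.
    gradients = dlgradient(loss, dlnet.Learnables);
end
```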


More Answers (0)
