How to use a custom transfer function in neural net training
I want to use a function similar to tansig. I don't seem to be able to find a good example, and the tansig.apply method only allows me one line! I'm wrapped around this axle, and I suspect I'm missing something simple. Any ideas? I'm using 2012b.
Accepted Answer
More Answers (5)
Bob
2013-3-27
4 Comments
Nn Sagita
2013-8-29
Bob, I modified the purelin transfer function and called it 'mtf'. I saved it in my working directory, trained a neural network, and got outputs. But I also got some messages, like this:
Exception in thread "AWT-EventQueue-0" java.lang.NullPointerException at com.mathworks.toolbox.nnet.v6.diagram.nnTransfer.paint(nnTransfer.java:35) at com.mathworks.toolbox.nnet.v6.image.nnOffsetImage.paint(nnOffsetImage.java:49) at ....
Could you help me? What should I do?
kelvina
2014-2-15
Thanks Bob, it helped me.
But we can do this directly by copying the file 'template_transfer' from C:\Program Files (x86)\MATLAB\R2010a\toolbox\nnet\nnet\nncustom,
replacing its function a = apply_transfer(n,fp) with your own function, and saving the file in your working directory. It will work.
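Following kelvina's recipe, the one subfunction you replace might look like this — a minimal sketch assuming the R2010a-era template layout, where apply_transfer computes the layer output and the rest of template_transfer's boilerplate stays as shipped. The squashing formula here is only an illustration (it is the fast tanh-like sigmoid Greg Heath calls y1 elsewhere in this thread), not part of the template:

function a = apply_transfer(n,fp)
% Core of a custom transfer function, in a copy of template_transfer.m
% saved in the working directory. Example body (illustrative only):
% a fast tanh-like squashing function.
a = n ./ (0.25 + abs(n));
end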
Mayank Gupta
2016-5-4
Can you please explain in detail how to save a custom training function to the nntool directory? I am using the Firefly algorithm for optimization.
Mehdi Jokar
2018-7-16
Bob, thank you for your instructions. But is apply the only function that needs to be modified, or do we need to modify the backprop and forwardprop functions in the + folder as well?
Mehdi
Greg Heath
2012-12-11
Edited: DGM
2023-2-23
I cannot understand why you think y2 is better than y1
x = -6:0.1:6;
y1 = x./(0.25+abs(x));
y2 = x.*(1 - (0.52*abs(x/2.6))); % (for -2.5<x<2.5)
figure
hold on
plot(x,y1)
plot(x,y2,'r')
mladen
2013-3-29
Thank you Bob. Nice trick with feedforwardnet.m (good for permanent use). I've managed to do this but some new questions arise:
- How do I use param in apply(n,param)? (more info: matlabcentral/answers/686)
- How to use different transfer functions within the same layer?
- My apply function looks something like this:
function A = apply(n,param)
%....
A=a1.*a2;
end
Now I would like to use a1 and a2 to speed up the derivative computation in da_dn.m (this has already been done with tansig.m, but with the final value, A in my code)... Is this possible?
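For reference, the caching mladen mentions is exactly what tansig does: since a = tanh(n), the derivative is 1 - a.^2, so da_dn can reuse the stored output instead of recomputing tanh(n). A sketch of that pattern — the signature here is an assumption based on the thread, so check the shipped da_dn.m in the +tansig package folder for the exact interface in your release:

function d = da_dn(n,a,param)
% Derivative of the transfer function with respect to its net input,
% reusing the precomputed layer output a rather than recomputing it.
% For tansig: a = tanh(n), so da/dn = 1 - a.^2.
% (Sketch; the exact shipped signature may differ by release.)
d = 1 - a.^2;
end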