How to use a custom transfer function in neural net training
I want to use a function similar to tansig. I don't seem to be able to find a good example, and the tansig.apply method only allows me one line! I'm wrapped around this axle, and I suspect I'm missing something simple. Any ideas? I'm using 2012b.
Accepted Answer
More Answers (5)
Bob
on 27 Mar 2013
4 Comments
Nn Sagita
on 29 Aug 2013
Bob, I modified the purelin transfer function, calling it 'mtf', and saved it in my working directory. I trained the neural network and got outputs, but I also got messages like this:
Exception in thread "AWT-EventQueue-0" java.lang.NullPointerException at com.mathworks.toolbox.nnet.v6.diagram.nnTransfer.paint(nnTransfer.java:35) at com.mathworks.toolbox.nnet.v6.image.nnOffsetImage.paint(nnOffsetImage.java:49) at ....
Could you help me? What should I do?
kelvina
on 15 Feb 2014
Thanks Bob, this helped me.
But we can also do this directly by copying the 'template transfer' file from C:\Program Files (x86)\MATLAB\R2010a\toolbox\nnet\nnet\nncustom,
replacing its function a = apply_transfer(n,fp) with your own function, and saving the file in your working directory. It will work.
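To make that concrete, here is a minimal sketch of what the replaced apply_transfer body could look like. Only the core computation is shown; the chosen squashing function (a fast, tansig-like Elliot sigmoid) is an illustration, and the rest of the template file (name, derivative, output range, and so on) must be kept consistent with it.

```matlab
% Hypothetical replacement for apply_transfer inside a copy of the
% template transfer file. The fast-sigmoid formula below is an example
% choice, not part of the original template.
function a = apply_transfer(n,fp)
  % Elementwise fast sigmoid: smooth, odd, bounded in (-1,1) like tansig
  a = n ./ (1 + abs(n));
end
```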
Mayank Gupta
on 4 May 2016
Can you please explain in detail how to save a custom training function to the nntool directory? I am using the Firefly algorithm for optimization.
Mehdi Jokar
on 16 Jul 2018
Bob, thank you for your instructions. But is apply the only function that needs to be modified, or do we also need to modify the backprop and forwardprop functions in the + folder?
Mehdi
Bob
on 10 Dec 2012
0 votes
Greg Heath
on 11 Dec 2012
Edited: DGM
on 23 Feb 2023
I cannot understand why you think y2 is better than y1:
x = -6:0.1:6;
y1 = x./(0.25 + abs(x));
y2 = x.*(1 - 0.52*abs(x/2.6)); % valid for -2.5 < x < 2.5
figure
hold on
plot(x,y1)
plot(x,y2,'r')
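For what it's worth, y1 is a sharpened form of the Elliot sigmoid, which R2012b ships as the elliotsig transfer function (a = n./(1 + abs(n))). A quick sketch comparing the two shapes with tansig; this assumes the Neural Network Toolbox is on the path:

```matlab
% Compare Greg's sharpened variant, the elliotsig-style shape, and tansig
x  = -6:0.1:6;
y1 = x./(0.25 + abs(x));   % Greg's variant (steeper near 0)
ye = x./(1 + abs(x));      % same formula as elliotsig(x)
figure
hold on
plot(x, y1)
plot(x, ye, 'g')
plot(x, tansig(x), 'k--')  % requires Neural Network Toolbox
legend('y1', 'elliotsig-style', 'tansig')
```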
mladen
on 26 Mar 2013
0 votes
Could anybody upload some examples of a modified tansig.m and +tansig folder? This would be very helpful for my project and for other people too. Thank you.
1 Comment
Nn Sagita
on 29 Aug 2013
If you have some examples of how to modify a transfer function, please share them with me. Thank you.
mladen
on 29 Mar 2013
Thank you Bob. Nice trick with feedforwardnet.m (good for permanent use). I've managed to do this, but some new questions arise:
- How do I use param in apply(n,param)? (more info -> matlabcentral/answers/686)
- How can I use different transfer functions within the same layer?
- My apply function looks something like this:
function A = apply(n,param)
%....
A = a1.*a2;
end
Now I would like to use a1 and a2 to speed up the derivative computation in da_dn.m (this has already been done with tansig.m, but with the final value, A in my code). Is it possible?
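For reference, if A = a1.*a2 then the product rule gives dA/dn = (da1/dn).*a2 + a1.*(da2/dn); unlike tansig, where the derivative 1 - a.^2 can be recovered from the output alone, a product of factors generally cannot be reconstructed from A, so the factors must be recomputed (or cached, if your release's internals allow it). A hypothetical sketch of such a da_dn, where the factor functions tanh and exp(-n.^2) are placeholders and the da_dn(n,a,param) signature is assumed to match the stock + package in your release:

```matlab
% Hypothetical da_dn for a product transfer A = a1 .* a2.
% Product rule: dA/dn = (da1/dn).*a2 + a1.*(da2/dn)
% The factor functions below are example choices, not from the thread.
function d = da_dn(n,a,param)
  a1 = tanh(n);                              % example factor 1
  a2 = exp(-n.^2);                           % example factor 2
  d  = (1 - a1.^2).*a2 + a1.*(-2*n.*a2);     % product rule
end
```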