How to use a Leaky ReLU/Softmax function in a hidden layer of a Feedforward Neural Network?
Hi.
I am using a feedforward neural network with an input layer, one hidden layer, and an output layer. I want to change the transfer function in the hidden layer to leaky ReLU, but the usual command (shown below for the poslin transfer function) is not working:
net.layers{1}.transferFcn = 'poslin'; % this command is working for poslin
Please suggest the command for changing the transfer function in layer 1 to leaky ReLU. Kindly also suggest the command for changing the output layer transfer function to softmax in a feedforward neural network.
Thank you
Ihsan
Answers (1)
Abhishek Tiwari
on 10 Jul 2022
Hi,
The following is a list of the transfer functions available for shallow networks; note that leaky ReLU is not among them:
% compet - Competitive transfer function.
% elliotsig - Elliot sigmoid transfer function.
% hardlim - Positive hard limit transfer function.
% hardlims - Symmetric hard limit transfer function.
% logsig - Logarithmic sigmoid transfer function.
% netinv - Inverse transfer function.
% poslin - Positive linear transfer function.
% purelin - Linear transfer function.
% radbas - Radial basis transfer function.
% radbasn - Radial basis normalized transfer function.
% satlin - Positive saturating linear transfer function.
% satlins - Symmetric saturating linear transfer function.
% softmax - Soft max transfer function.
% tansig - Symmetric sigmoid transfer function.
% tribas - Triangular basis transfer function.
net.layers{1}.transferFcn = 'poslin';
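For softmax on the output layer, the same transferFcn assignment works, since softmax appears in the list above. A minimal sketch, assuming a one-hidden-layer network with 10 hidden neurons (adjust the size to your data):
net = feedforwardnet(10);               % 10 hidden neurons (assumed size)
net.layers{1}.transferFcn = 'poslin';   % hidden layer: positive linear (ReLU)
net.layers{2}.transferFcn = 'softmax';  % output layer: softmax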
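Because the list has no leaky ReLU entry, net.layers{1}.transferFcn cannot be set to one directly. One option is to switch to the layer-based deep learning framework, which does provide leakyReluLayer and softmaxLayer. A sketch, assuming a classification task with 4 input features and 3 classes (X and Y are placeholders for your data):
layers = [
    featureInputLayer(4)                % 4 input features (assumed)
    fullyConnectedLayer(10)             % hidden layer, 10 neurons (assumed)
    leakyReluLayer(0.01)                % leaky ReLU with scale 0.01
    fullyConnectedLayer(3)              % 3 output classes (assumed)
    softmaxLayer
    classificationLayer];
options = trainingOptions('adam');
net = trainNetwork(X, Y, layers, options);  % X: N-by-4 numeric matrix, Y: categorical labels (assumed)
Alternatively, a custom shallow-network transfer function can be written by modeling it on a built-in one such as poslin, but that also requires defining its derivative, so the layer-based approach is usually simpler.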