Feedforward net - how to use LeakyReLU or scaled exponential linear unit for the hidden layers?

In a multi-layer shallow network created with feedforwardnet, how can I use activation functions such as leaky ReLU or the scaled exponential linear unit (SELU) in the hidden layers? The only transfer function supported by default for the hidden layers appears to be tansig.
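For context, a minimal sketch of how the hidden-layer transfer function of a feedforwardnet can be changed to one of the shipped alternatives, e.g. 'poslin' (plain ReLU). The data here is made up for illustration; leaky ReLU and SELU are not among the shipped shallow-network transfer functions, so this only shows the mechanism of swapping transferFcn, not those two functions themselves.

```matlab
% Illustrative data (assumed): 10 input features, 1 target, 500 samples.
x = rand(10, 500);
t = sum(x, 1) + 0.1*randn(1, 500);

% Shallow network with two hidden layers of 20 and 10 neurons.
net = feedforwardnet([20 10]);

% Default hidden-layer transfer function is 'tansig'.
disp(net.layers{1}.transferFcn)

% Built-in alternatives can be assigned by name, e.g. 'poslin' (ReLU),
% 'logsig', 'satlin', or 'elliotsig'. Leaky ReLU and SELU are not
% shipped transfer functions in this shallow-network framework.
net.layers{1}.transferFcn = 'poslin';
net.layers{2}.transferFcn = 'poslin';

% Train and evaluate as usual.
net = train(net, x, t);
y = net(x);
```

For leaky ReLU or SELU specifically, two routes seem possible: writing a custom transfer function modeled on the shipped ones (e.g. copying and renaming tansig and its package folder), or moving to the layer-based deep learning framework, where leakyReluLayer is available from R2017b onward; SELU would still require a custom implementation in either case.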

Answers (0)

Release: R2018a
