Hyperbolic tangent sigmoid transfer function
Tip: To use a hyperbolic tangent activation for deep learning, use the tanhLayer function or the dlarray method tanh.
A = tansig(N) takes a matrix of net input vectors, N, and returns the S-by-Q matrix, A, of the elements of N squashed into [-1, 1].
tansig is a neural transfer function. Transfer functions calculate the output of a layer from its net input.
a = tansig(n) = 2/(1+exp(-2*n))-1
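The formula above can be sketched outside MATLAB as well. The following is an illustrative NumPy reimplementation (not the MATLAB source); the function name `tansig` is reused here only for clarity:

```python
import numpy as np

def tansig(n):
    # Hyperbolic tangent sigmoid: a = 2/(1 + exp(-2*n)) - 1
    return 2.0 / (1.0 + np.exp(-2.0 * n)) - 1.0

n = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
a = tansig(n)
# Every output lies in [-1, 1]; tansig(0) is 0, and large
# positive/negative inputs saturate toward +1/-1.
```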
This is mathematically equivalent to tanh(N). It differs in that it runs faster than the MATLAB implementation of tanh, but the results can have very small numerical differences. This function is a good tradeoff for neural networks, where speed is important and the exact shape of the transfer function is not.
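The near-equivalence can be checked numerically. This NumPy sketch (an assumption about magnitudes, not a claim about MATLAB's internals) compares the sigmoid formula against a library tanh and measures the worst-case discrepancy:

```python
import numpy as np

rng = np.random.default_rng(0)
n = rng.standard_normal(1000)

a_formula = 2.0 / (1.0 + np.exp(-2.0 * n)) - 1.0  # tansig formula
a_tanh = np.tanh(n)                                # reference tanh

# The two agree to roughly machine precision; any differences
# are far below what matters for neural network training.
max_diff = np.max(np.abs(a_formula - a_tanh))
```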