
How to avoid Inf values when writing deep learning code?

3 views (last 30 days)
Hi,
I wrote deep learning code that includes the Softmax function below. During training I start to get Inf values (and consequently NaN values) in some matrix multiplication operations or as the result of the softmax operation.
I also tried other softmax implementations that I found on the internet and in books, with no improvement.
Getting these NaN values even in the first training epoch, on the very first samples (for example the 5th sample), leads to incorrect training of the model.
To simplify my question I did not include information about the number of nodes in the input, output, and hidden layers, because I think the problem occurs independently of these numbers. If requested, I can provide more details.
Best Regards,
Ferda Özdemir Sönmez
function y = Softmax(x)
ex = exp(x);
y = ex/sum(ex);
end
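One common remedy for the overflow described above (a sketch only; the function name StableSoftmax is illustrative, not part of the original code) is the max-subtraction trick: shifting every element of x by max(x) leaves the softmax ratio unchanged, but caps the largest argument of exp at 0 so exp can no longer return Inf.

```matlab
function y = StableSoftmax(x)
% Numerically stable softmax for a vector x.
% Subtracting max(x) before exponentiating does not change the result
% mathematically (the common factor exp(-max(x)) cancels in the ratio),
% but it keeps the largest argument of exp at 0, so exp never
% overflows to Inf and the ratio never becomes Inf/Inf = NaN.
ex = exp(x - max(x));
y = ex ./ sum(ex);
end
```

For example, with the original implementation Softmax([1000; 1000]) yields NaN because exp(1000) overflows to Inf and Inf/Inf is NaN, while StableSoftmax([1000; 1000]) returns [0.5; 0.5].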

Answers (0)

Release

R2018b
