Gradient clipping with custom feed-forward net
Every time I train my custom feed-forward net, which has 2 inputs and one output (a time series), with the train(net, ...) function,
the gradient reaches the preset value after ~10 training epochs and training stops.
Changing the network's architecture is not an option in my case.
Is there a way to implement gradient clipping with a feed-forward net?
Or is there any other workaround for exploding gradients?
1 Comment
Christoph Aistleitner
on 28 Jul 2021
Accepted Answer
More Answers (1)
Artem Lensky
on 4 Dec 2022
Please check this link, which shows several examples of how to implement training options that you would usually define via trainingOptions() and use with trainNetwork(), but for custom training loops. Here is the L2-norm clipping example given in the link above:
function gradients = thresholdL2Norm(gradients,gradientThreshold)
    % Rescale the gradients so that their L2 norm does not exceed the threshold.
    gradientNorm = sqrt(sum(gradients(:).^2));
    if gradientNorm > gradientThreshold
        gradients = gradients * (gradientThreshold / gradientNorm);
    end
end
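In a custom training loop, the helper above can be applied to each parameter's gradient with dlupdate before the update step. A minimal sketch of one iteration, where net (a dlnetwork), modelGradients (a user-defined loss/gradient function), X, T, and vel are hypothetical names not from the original post:

```matlab
% Sketch of one custom-training-loop iteration with gradient clipping.
% Assumed: net (dlnetwork), modelGradients (user-defined function returning
% loss and gradients), X and T (dlarray input and target), vel (SGDM state).
gradientThreshold = 2;  % clipping threshold (tune for your problem)
[loss, gradients] = dlfeval(@modelGradients, net, X, T);

% Clip each parameter's gradient to the L2-norm threshold:
gradients = dlupdate(@(g) thresholdL2Norm(g, gradientThreshold), gradients);

% Apply the clipped gradients, e.g. with SGDM:
learnRate = 0.01;
momentum  = 0.9;
[net, vel] = sgdmupdate(net, gradients, vel, learnRate, momentum);
```

This mirrors the pattern used in the MathWorks custom-training-loop examples, where gradients is a table of parameter gradients and dlupdate applies the clipping function to each entry.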
You might also find this page useful: https://au.mathworks.com/help/deeplearning/ug/detect-vanishing-gradients-in-deep-neural-networks.html, which discusses detecting vanishing gradients in deep neural networks.