clamp cross-entropy loss
Matt Fetterman
on 3 Sep 2020
Commented: Matt Fetterman
on 6 Sep 2020
The MATLAB cross-entropy loss has this form:
loss = -sum(W*(T.*log(Y)))/N;
I would like to "clamp" it so that the output of the log function is bounded; for example, it cannot be less than 100. Can we do it?
Accepted Answer
David Goodmanson
on 3 Sep 2020
Edited: David Goodmanson
on 6 Sep 2020
Hi Matt,
z = log(Y);
z(z < 100) = 100;          % logical indexing: replace every element below the limit
loss = -sum(W*(T.*z))/N;
In the link you provided, they talk about a limit of -100 rather than +100; the former appears to make more sense. There are lots of possibilities for a smooth, differentiable cutoff. Here is one, assuming Y >= 0:
Ylimit = -100;
loss = -sum(W*(T.*log(Y + exp(Ylimit))))/N;
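A quick way to see why this works: as Y approaches 0, log(Y + exp(Ylimit)) approaches Ylimit, while for Y much larger than exp(Ylimit) the added term is negligible and the expression is essentially log(Y). A minimal sketch of that check (the sample values here are illustrative, not from the original answer):

```matlab
% Illustrative check of the smooth floor on log(Y).
Ylimit = -100;
Y = [0 1e-10 0.5 1];               % sample probabilities, including an exact zero
z = log(Y + exp(Ylimit));          % bounded below by Ylimit, differentiable in Y
% z(1) equals Ylimit exactly; for the larger entries, z is
% indistinguishable from log(Y) since exp(-100) ~ 3.7e-44.
```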