Adding dlarray support / differentiability to custom functions
I am training deep neural networks with custom training loops, defining the model/network as a function passed to dlfeval(). The model uses functions that do not currently have dlarray support, so I cannot backpropagate through them to compute gradients for optimization (for example, matrix determinant and inverse).
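For concreteness, here is a minimal sketch of the failure (the modelLoss function is just an illustration; it assumes det() lacks dlarray support, as it does at the time of writing):

X = dlarray(randn(3));
try
    [loss, grad] = dlfeval(@modelLoss, X);
catch err
    disp(err.message)            % det() errors on the traced dlarray input
end

function [loss, grad] = modelLoss(X)
    loss = det(X)^2;             % unsupported operation: the trace breaks here
    grad = dlgradient(loss, X);  % never reached
end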
The only way I have found to add support for these functions is to wrap them in a custom layer (nnet.layer.Layer) with a custom backward() function. However, I can only use this custom layer after embedding it in a layerGraph and dlnetwork, together with input and output layers. This seems very cumbersome. Is there a more straightforward option?
This may be a feature request: please let us define new functions with dlarray support. The official list (https://www.mathworks.com/help/deeplearning/ug/list-of-functions-with-dlarray-support.html) is growing slowly, but the community could expand it very quickly.
Answers (1)
Shivansh
on 23 Oct 2023
Hi Damien,
I understand that you are using some functions in your deep neural network model that are not currently supported by dlarray.
I tried to reproduce this on my end, and wrapping the functions in a custom layer with a custom backward() function does look like the only possible approach. It seems like a valid feature request.
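For illustration, here is a minimal sketch of such a layer for the matrix inverse mentioned in the question. The class name matrixInverseLayer is my own, and the sketch assumes a single square matrix per call with no batch dimension (save it as matrixInverseLayer.m on the MATLAB path):

classdef matrixInverseLayer < nnet.layer.Layer
    % Sketch: wraps inv(), which lacks dlarray support, and supplies
    % the derivative by hand via a custom backward() method.
    methods
        function layer = matrixInverseLayer(name)
            layer.Name = name;
        end

        function Z = predict(~, X)
            % inv() errors on dlarray inputs, so strip the data out and
            % re-wrap the result. Tracing is not needed here because
            % backward() below provides the derivative explicitly.
            Z = dlarray(inv(extractdata(X)));
        end

        function dLdX = backward(~, ~, Z, dLdZ, ~)
            % Adjoint of the matrix inverse: for Z = inv(X),
            % dL/dX = -Z' * (dL/dZ) * Z'.
            dLdX = -Z' * dLdZ * Z';
        end
    end
end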
I expect that MathWorks is aware of this issue and will add support for custom differentiable functions with dlarray in a future release.
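In the meantime, here is a rough sketch of how the layer might be embedded, following the layerGraph/dlnetwork workflow described in the question. The input layer size, names, and loss are assumptions, and the custom layer can sit wherever the unsupported operation is needed, not necessarily at the end of the network:

layers = [
    imageInputLayer([3 3 1], "Normalization", "none", "Name", "in")
    matrixInverseLayer("inv")];
net = dlnetwork(layerGraph(layers));

X = dlarray(randn(3, 3, 1), "SSC");          % single observation, no batch
[loss, grad] = dlfeval(@(X) modelLoss(net, X), X);

function [loss, grad] = modelLoss(net, X)
    Z = forward(net, X);                     % custom backward() handles inv
    loss = sum(Z, "all");                    % placeholder loss for the sketch
    grad = dlgradient(loss, X);
end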
Hope it helps!
1 Comment
Miguel Rivas Costa
on 24 Jan 2024
Hi, maybe a bit late, but I would like to know how to wrap the functions that are not currently supported by dlarray in a custom layer. Should that custom layer be at the end of the network?