MATLAB Answers

Deep learning with a GPU that supports fp16

Hi.
NVIDIA released the RTX 2xxx series, which supports fp16 and accelerates the training process.
Does MATLAB support this?
Thank you

  4 Comments

Joss Knight
Joss Knight on 29 Aug 2019
It is supported for deep learning code generation, but not for general code generation.
Krishna Bindumadhavan
Krishna Bindumadhavan on 14 Sep 2019
There is support for half precision in MATLAB via the half-precision object, available in the Fixed-Point Designer toolbox: https://www.mathworks.com/help/fixedpoint/ref/half.html.
General code generation support for the half-precision data type via MATLAB Coder and GPU Coder is under active development and is expected in an upcoming release.
As mentioned below, there is currently no support for using half precision to train a deep learning network in MATLAB. This is expected in a future release.
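For reference, a minimal sketch of using the half object (assuming Fixed-Point Designer is installed; the specific values are arbitrary illustrations):

a = half(pi);      % convert a double value to half precision
b = a + half(1);   % arithmetic is carried out in half precision
c = double(b);     % convert back to double when full precision is needed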


Accepted Answer

Joss Knight
Joss Knight on 29 Aug 2019
You can take advantage of FP16 when generating code for prediction on a deep neural network. Follow the pattern of the Deep Learning Prediction with NVIDIA TensorRT example but set the DataType property of the DeepLearningConfig to 'fp16'. This will use the Tensor cores on a Volta or Turing card such as the RTX series.
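A minimal sketch following that example's pattern is shown below. The entry-point function name myPredict, the use of resnet50 (which requires its support package), and the 224x224x3 input size are illustrative assumptions, not part of the original answer.

% myPredict.m -- illustrative entry-point function for code generation
function out = myPredict(in) %#codegen
persistent net;
if isempty(net)
    % Load a pretrained network once (resnet50 is just an example)
    net = coder.loadDeepLearningNetwork('resnet50');
end
out = net.predict(in);
end

% At the MATLAB command line, configure GPU Coder to target TensorRT in fp16:
cfg = coder.gpuConfig('mex');
cfg.TargetLang = 'C++';
cfg.DeepLearningConfig = coder.DeepLearningConfig('tensorrt');
cfg.DeepLearningConfig.DataType = 'fp16';   % uses Tensor cores on Volta/Turing GPUs
codegen -config cfg myPredict -args {ones(224,224,3,'single')} -report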
There is no way yet to use half precision or Tensor cores for training a deep neural network in MATLAB. This is expected in an upcoming release.


More Answers (0)
