Matlab 2018b GPU Training

Until now I have been training an LSTM network in MATLAB R2018a and didn't have a problem using my GPU as the training device. However, since I needed to change the activation functions of my LSTM layers, I updated MATLAB, and now when I try to use my GPU it trains the network far slower than the CPU, which doesn't make sense. For some reason MATLAB is not using my GPU's memory. Any ideas how to solve this?

Accepted Answer

Joss Knight on 7 Nov 2018
Edited: Joss Knight on 7 Nov 2018
Do you mean you switched to the hard-sigmoid or softsign activations? These are supported in R2018b, but the implementation is not optimized because cuDNN doesn't support them, so it is indeed much slower. I would recommend sticking with the default activations for performance, if you can make that work.
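For reference, the activation choices are controlled by name-value options on lstmLayer. A minimal sketch, assuming a typical sequence-regression setup (the layer sizes and input dimension here are illustrative, not from the thread):

```matlab
% Sketch: an LSTM layer with the default (cuDNN-accelerated) activations.
% 'StateActivationFunction' and 'GateActivationFunction' are the options
% that control this; choosing 'softsign' or 'hard-sigmoid' instead of the
% defaults below falls back to the slower, non-cuDNN code path.
layers = [ ...
    sequenceInputLayer(12)                           % 12 input features (illustrative)
    lstmLayer(100, ...
        'StateActivationFunction', 'tanh', ...       % default; 'softsign' disables cuDNN
        'GateActivationFunction',  'sigmoid', ...    % default; 'hard-sigmoid' disables cuDNN
        'OutputMode', 'last')
    fullyConnectedLayer(1)
    regressionLayer];

% Force GPU training so the slowdown from non-default activations is visible.
options = trainingOptions('adam', 'ExecutionEnvironment', 'gpu');
```

With the defaults, training should run on the cuDNN-backed fast path; switching either option is what triggers the slowdown described above.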
  1 Comment
Eduardo Gaona Peña on 8 Nov 2018
Edited: Eduardo Gaona Peña on 8 Nov 2018
Ahhhh ok ok. I was wondering why my CPU was doing it faster than my GPU. Sadly I have to use these activation functions, because I want to implement the network in a drive and, given its limitations, a hard-sigmoid is easier to compute than an exponential. Luckily I only need about 1000 epochs. Thanks for the information anyway :D !
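The cost difference the commenter is relying on can be made concrete. A sketch comparing the logistic sigmoid (which needs an exponential) with the hard sigmoid, using the piecewise-linear form commonly used for LSTM gates (0 below -2.5, 1 above 2.5, a straight line in between):

```matlab
% Logistic sigmoid: requires evaluating exp(), expensive on limited
% embedded hardware such as a motor drive.
sigmoid = @(x) 1 ./ (1 + exp(-x));

% Hard sigmoid: just a clipped line, 0.2*x + 0.5, saturated to [0, 1].
% Only a multiply, an add, and two comparisons per element.
hardSigmoid = @(x) max(0, min(1, 0.2 .* x + 0.5));

x = -5:0.5:5;
approxErr = max(abs(sigmoid(x) - hardSigmoid(x)));  % worst-case gap on this grid
```

Both map onto [0, 1] and agree closely near zero, which is why the hard sigmoid works as a drop-in gate activation where an exponential is too costly.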




