Can I use a GeForce GTX 960M for deep learning?

Dear All,
When I check my NVIDIA Control Panel, it shows that my laptop has a GeForce GTX 960M GPU. I am not sure whether I can use it for deep learning.
If yes, do I only need to set the 'ExecutionEnvironment' to 'gpu' in the training options, or are there other settings I should configure?
Regards
  2 Comments
Walter Roberson on 14 May 2019
Yes, its CUDA compute capability of 5.0 is supported. Memory will be a bit tight for it. Install the latest NVIDIA drivers.
Negar Noorizadeh on 14 May 2019
Thanks for your response. To install the latest version, should I use this link?
If yes, do you have any idea what I should select for 'Windows Driver Type' (Standard or DCH) and 'Download Type' (Game Ready Driver or Creator Ready Driver)?


Accepted Answer

Shivam Sardana on 22 May 2019
Assuming CUDA and cuDNN are installed, you can query the GPU and get information about it by running the following command:
gpuDevice
GPU, multi-GPU, and parallel options require Parallel Computing Toolbox. To use a GPU for deep learning, you must also have a CUDA® enabled NVIDIA® GPU with compute capability 3.0 or higher.
To use the GPU for deep learning, set 'ExecutionEnvironment' in trainingOptions to 'gpu'. By default ('auto'), trainingOptions uses a GPU if one is available.
Hope this helps.
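As a minimal sketch (assuming Parallel Computing Toolbox and Deep Learning Toolbox are installed; layers, XTrain and YTrain below are placeholders for your own network and data, and the solver settings are only examples):
% Check the GPU; this prints the device name and its compute capability
gpuDevice

% Request the GPU explicitly when training
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment', 'gpu', ...
    'MaxEpochs', 10, ...
    'MiniBatchSize', 64);

% net = trainNetwork(XTrain, YTrain, layers, options);  % layers, XTrain, YTrain are placeholders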

More Answers (1)

Prathamesh Degwekar on 21 May 2019
Hi,
According to the current scenario, DCH is the way to go, as explained in the article from Intel here:
On the other question, you can go with either of the two. Their differences can be found here.
