Running GLMs on GPU
I was wondering whether it is possible to fit generalized linear models on a GPU to speed things up, since this is the rate-limiting step in my code: fitting the many models I need ends up taking about a week.
The training data for each model is on the order of [100x10]. Is it possible with a simple gpuArray? If not, how do I run GLMs on a GPU?
Thanks in advance. (P.S. I have no idea how GPU programming is done.)
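For illustration only, here is a minimal sketch of one way a single GLM fit could be pushed onto the GPU, assuming the Parallel Computing Toolbox is available. Since fitglm may not accept gpuArray inputs directly, the iteratively reweighted least squares (IRLS) loop is written by hand on gpuArray data. The data, the logistic (logit) link, and all variable names below are hypothetical placeholders, not a tested solution.

% Sketch: hand-rolled IRLS for a logistic-regression GLM on gpuArray data.
% Assumes Parallel Computing Toolbox; X and y are stand-ins for real data.
X = gpuArray(rand(100, 10));              % stand-in predictor matrix
y = gpuArray(double(rand(100, 1) > 0.5)); % stand-in binary response

Xd   = [ones(size(X,1), 1, 'like', X), X];  % add an intercept column
beta = zeros(size(Xd,2), 1, 'like', X);     % coefficients live on the GPU

for iter = 1:25
    eta = Xd * beta;                        % linear predictor
    mu  = 1 ./ (1 + exp(-eta));             % logistic mean
    w   = max(mu .* (1 - mu), eps);         % IRLS weights, floored for stability
    z   = eta + (y - mu) ./ w;              % working response
    WX  = Xd .* w;                          % row-wise weighting (implicit expansion)
    betaNew = (Xd' * WX) \ (Xd' * (w .* z));  % weighted least-squares update
    if norm(betaNew - beta) < 1e-8
        beta = betaNew;
        break
    end
    beta = betaNew;
end

coeffs = gather(beta);   % bring the fitted coefficients back to the CPU

Note that for matrices as small as 100-by-10, the per-model overhead of moving data to the GPU may outweigh any gain from the GPU itself; since the bottleneck is the number of models rather than the size of each one, parallelizing the loop over models across CPU cores (for example with parfor) may be the more effective route.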