MATLAB Answers


Recursive computation using multiple GPUs

Asked by Donghun Ryu on 19 Apr 2019
Latest activity Answered by Walter Roberson
on 19 Apr 2019
Hi,
I'm wondering if any built-in MATLAB functions can do the following computation.
variables x, y, z, v
for loop (on GPU 1)
    update x
    update y
pass the updated x, y to GPU 2
for loop (on GPU 2)
    update z using x, y
    update v using x, y
pass the updated z, v back to GPU 1 and iterate.
As far as I know, parpool and related MATLAB functions can only compute "independent" calculations on multiple GPUs. Can I ping-pong GPU arrays across multiple GPUs to compute memory-efficient recursive relations?


1 Answer

Answer by Walter Roberson
on 19 Apr 2019

No.
Within any one process, in order to send commands to a different GPU, you need to select the GPU. Selecting a GPU synchronizes and resets the GPU, and so is a rather expensive operation.
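To illustrate, here is a minimal sketch (untested, and assuming a machine with at least two GPUs and Parallel Computing Toolbox) of what happens when one process switches devices:

```matlab
% Selecting a device resets it and invalidates existing gpuArray data,
% so frequent switching inside one process is expensive.
d1 = gpuDevice(1);        % select (and reset) GPU 1
x  = gpuArray(rand(1000));% x lives on GPU 1

d2 = gpuDevice(2);        % select (and reset) GPU 2 - synchronizes,
                          % and x is no longer valid to use
% y = x + 1;              % would error: x's data no longer exists on
                          % the currently selected device
```

This is why the ping-pong pattern is usually done with one worker per GPU rather than one process switching back and forth.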
It is not completely accurate to say that parpool and related functions can only compute "independent" calculations.
  • You can use parpool queues (two flavors: parallel.pool.DataQueue and parallel.pool.PollableDataQueue) to communicate between the client (main thread) and servers (parpool workers); if workers need to talk to each other, the client can act as a "switchboard", receiving data from one worker and sending it to another
  • with spmd you can use labSend() and labReceive() to transfer data between workers
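As a hedged sketch of the second approach (untested; assumes two GPUs, Parallel Computing Toolbox, and placeholder update rules standing in for your real recursion):

```matlab
pool = parpool(2);
spmd
    gpuDevice(labindex);                 % worker 1 -> GPU 1, worker 2 -> GPU 2
    if labindex == 1
        x = gpuArray.zeros(1000, 1);
        y = gpuArray.zeros(1000, 1);
        for iter = 1:10
            x = x + 1;                   % "update x" on GPU 1 (placeholder)
            y = y + 2;                   % "update y" on GPU 1 (placeholder)
            % gather to host memory before sending between workers
            labSend({gather(x), gather(y)}, 2);
            zv = labReceive(2);          % wait for updated z, v from GPU 2
        end
    else
        for iter = 1:10
            xy = labReceive(1);          % receive x, y from worker 1
            z = gpuArray(xy{1}) .* 2;    % "update z using x, y" (placeholder)
            v = gpuArray(xy{2}) + z;     % "update v using x, y" (placeholder)
            labSend({gather(z), gather(v)}, 1);
        end
    end
end
delete(pool);
```

Each worker keeps its own GPU selected for the whole loop, so no device resets occur; the cost is the per-iteration gather/transfer through host memory.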
So, the workers can (even if indirectly) communicate, and therefore the GPU calculations do not need to be independent. You can transfer data to take advantage of memory efficiency. But don't count on it being time efficient.
Do remember, though, that every communication between GPU and MATLAB requires a synchronization, which can be the most expensive part of the computation.
Under quite restricted circumstances, you can take advantage of NVLink for distributed arrays between GPUs. See https://www.mathworks.com/matlabcentral/answers/436097-nvidia-nvlink-in-matalb-using-multiple-geforce-2080-ti-cards#answer_353219
