Why are multiple GPUs slower than one GPU?

2 views (last 30 days)
Dear All,
On my machine there are 2 GPUs. Why is moving data to multiple GPUs in my case about 5x slower than working with just one GPU? Environment: Windows 10, MATLAB R2017b. Here is the code and an example:
clear;
dd1 = rand(100000,200,10);
cc1 = rand(100000,200,10);
tic
dd = gpuArray(dd1);
cc = gpuArray(cc1);
wait(gpuDevice);
toc
nGPUs = gpuDeviceCount();
parpool('local', nGPUs);
d1 = rand(100000,200,10);
d2(1) = {d1(1:50000,:,:)};
d2(2) = {d1(50001:100000,:,:)};
c1(1:nGPUs) = {zeros(50000,200,10)};
tic
parfor i = 1:nGPUs
    gpuDevice(i);
    c = gpuArray(c1{i});
    d = gpuArray(d2{i});
end
toc
6 comments
Joss Knight, 2018-10-6
You're not just moving data to two GPUs, you're moving it from the client to the pool, and then onto the GPUs. Communicating between processes takes time. Also, you don't call wait(gpuDevice) inside the loop, which means the copy to the device hasn't necessarily finished when you stop timing.
In a real multi-GPU example you would be doing significant computation and constructing data on the pool, rather than on the client. This example is all overhead and so isn't very representative. You would see a similar issue if you opened a pool of only one worker.
Also, you don't need to select the gpuDevice since selecting a different GPU on each worker is done automatically for communicating jobs.
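The points above can be sketched as a revised timing loop. This is a minimal sketch, assuming the pool is already open and the cell arrays c1 and d2 from the question exist on the client; the explicit gpuDevice(i) call is dropped (each worker is assigned its own GPU automatically) and wait(gpuDevice) closes the timed region so the transfer is fully measured:

```matlab
% Time only the client-to-pool-to-GPU transfers (sketch, not a benchmark)
tic
parfor i = 1:nGPUs
    % no gpuDevice(i) needed: the pool assigns a distinct GPU per worker
    c = gpuArray(c1{i});   % copy this worker's slice onto its GPU
    d = gpuArray(d2{i});
    wait(gpuDevice);       % make sure the copies complete before toc
end
toc
```

Note that this still pays the client-to-worker communication cost, which is the dominant overhead in the original example.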
Mantas Vaitonis, 2018-10-7
Yes, you are right. Once I stopped selecting gpuDevice and constructed the data on the pool, the speed improved significantly, and it is faster than one GPU. But that is only achieved if the data is constructed on the pool; if the data is already defined on the client, is there no way to overcome the overhead? Maybe you could help me a bit more? In my experiment I would load data of size (5000000x300x50) from a file. How should I move this data to the pool, and what would be the way to divide it between both GPUs?
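One standard way to ship pre-existing client data to specific workers is a Composite: each assigned element is sent to its worker once and stays there for subsequent spmd blocks. This is a minimal sketch under the assumption that a 2-worker pool is open and each half fits in a worker's memory (an array of 5000000x300x50 doubles is far too large to hold in RAM at once, so in practice each worker would load its own slice directly from the file inside spmd instead):

```matlab
% Sketch: distribute halves of client-side data to two pool workers,
% then move each half onto that worker's GPU.
d1 = rand(100000,200,10);        % stand-in for data loaded from file
halves = Composite();            % one slot per pool worker
halves{1} = d1(1:50000,:,:);     % sent to worker 1
halves{2} = d1(50001:end,:,:);   % sent to worker 2
spmd
    % inside spmd, 'halves' refers to this worker's own slice
    g = gpuArray(halves);        % copy the slice onto this worker's GPU
    wait(gpuDevice);             % ensure the transfer has completed
end
```

The client-to-worker copy still happens once, but after that the data lives on the workers and can be reused across spmd blocks without being resent.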


Answers (0)
