Converting parfor operations to gpuArray

I have a working parallel version of a code that does some likelihood calculations on a reasonably large matrix (using parfor). It is a trivially parallel operation: the calculation is performed column-wise, and the parfor operates on the columns of the data (one worker per column).
How could I achieve the same thing using a GPU (since the matrix is quite big and I have a limited number of workers)? All the operations are GPU-supported functions (matrix algebra only: eig, diag, and matrix multiplications).
i.e.,
data = rand(1000, 200);              % 1000 rows by 200 columns (placeholder for my data)
[nrows, ncols] = size(data);
parfor ix = 1:ncols
    workerData = data(:, ix);        % each worker gets one column
    likelihood(ix) = funcCalcLikelihood(workerData, params);
end
This is fast enough, but I need to repeat such calculations many times to do a parameter sweep, so any speed increment would be good. Also, my dataset is getting bigger (ncols = 1500, and I only have 144 workers max).
I have 2 Tesla (C2050) GPUs and was wondering whether I could convert this into a gpuArray operation.
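One possible conversion, sketched below under the assumption that funcCalcLikelihood uses only gpuArray-supported operations (eig, diag, and matrix multiplication all are): transfer the whole matrix to the GPU once, index columns on-device, and gather the result at the end. The names funcCalcLikelihood and params are from the question; gpuArray, gather, and gpuArray.zeros are standard Parallel Computing Toolbox calls.

```matlab
dataGPU = gpuArray(data);                 % one host-to-device transfer
likelihoodGPU = gpuArray.zeros(1, ncols); % preallocate result on the GPU

for ix = 1:ncols
    colGPU = dataGPU(:, ix);              % column stays on the GPU
    likelihoodGPU(ix) = funcCalcLikelihood(colGPU, params);
end

likelihood = gather(likelihoodGPU);       % single device-to-host transfer
```

To use both GPUs, a common pattern is a parfor over a pool of two workers, each calling gpuDevice to select its own card and processing half the columns.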
Thanks for your inputs.
  3 Comments
nah 2013-8-19
Thanks +Edric Ellis for your comment. I didn't quite get what you mean by converting the input data, though. Do you mean that calling gpuArray automatically slices the big matrix by columns?


Answers (0)
