Using GPU, multiply a 3D matrix by a 2D matrix (slicewise)

Hi everyone,
I am trying to vectorize the code for my neural network so that it can be quickly executed on the GPU.
I need to multiply each slice of a 3D matrix X by a 2D matrix T (X is 3D, T is 2D).
Is there a good, fast way to do this on the GPU? I have heard it suggested to use repmat to replicate the 2D matrix into a 3D one (duplicating it once per slice), but that feels wasteful and inefficient.
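For reference, here is the computation written as a plain loop (sizes are just for illustration):
X = rand(10, 10, 4);                           % example 3D array: four 10-by-10 slices
T = rand(10);                                  % example 2D matrix
Y = zeros(size(X,1), size(T,2), size(X,3));    % preallocate the result
for k = 1:size(X,3)
    Y(:,:,k) = X(:,:,k) * T;                   % multiply each slice by T
end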
Thanks!

Accepted Answer

Edric Ellis, 26 July 2016
In this case, you can use pagefun. For example:
X = rand(10, 10, 4, 'gpuArray');   % 3D array: four 10-by-10 pages on the GPU
T = rand(10, 'gpuArray');          % 10-by-10 matrix on the GPU
pagefun(@mtimes, X, T)             % multiplies each page X(:,:,k) by T
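pagefun implicitly expands the 2D input T across the pages of X, so the result matches an explicit per-slice loop. A quick sanity check along those lines, using the X and T from above:
Y = pagefun(@mtimes, X, T);            % T is applied to every page of X
Yloop = zeros(10, 10, 4, 'gpuArray');  % preallocate on the GPU
for k = 1:size(X,3)
    Yloop(:,:,k) = X(:,:,k) * T;       % same computation, one slice at a time
end
max(abs(Y(:) - Yloop(:)))              % should be ~0 (floating-point round-off only)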
  2 Comments
Brad Hesse, 27 July 2016
Oh my god! I am speechless! My entire forward/backward propagation algorithm worked on the very first try after I completely rewrote it to be vectorized for GPU execution. This is at least a 40-50-fold speed improvement (granted, my original code wasn't very well optimized for CPU execution).
I cannot believe how fast this is.
Thank you so much for your help, Edric. I had actually already tried pagefun, but it failed and I assumed it didn't support multiplying a 3D matrix by a 2D matrix.
