Using the GPU, multiply a 3D matrix by a 2D matrix (slicewise)
Hi everyone,
I am trying to vectorize the code for my neural network so that it executes quickly on the GPU.
I need to multiply each slice of a 3D matrix X by a 2D matrix T, i.e. compute Y(:,:,k) = X(:,:,k) * T for every slice k.
Is there a good, fast way to do this on the GPU? I have seen it suggested to use repmat to replicate the 2D matrix into a 3D one (duplicating it many times), but that feels wasteful and inefficient.
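For concreteness, here is the operation written as a plain loop (the array sizes are made up for illustration); the question is how to get the same result on the GPU without the loop or a repmat copy of T:

X = rand(10, 10, 4);          % example 3D array: four 10x10 slices
T = rand(10);                 % example 2D matrix
Y = zeros(size(X));           % preallocate the result
for k = 1:size(X, 3)
    Y(:,:,k) = X(:,:,k) * T;  % multiply each slice by T
end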
Thanks!
Accepted Answer
Edric Ellis
2016-7-26
X = rand(10, 10, 4, 'gpuArray');   % 3D gpuArray: four 10x10 slices
T = rand(10, 'gpuArray');          % 2D gpuArray: one 10x10 matrix
Y = pagefun(@mtimes, X, T)         % Y(:,:,k) = X(:,:,k) * T for each k
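Here pagefun applies @mtimes to each page (slice) of its gpuArray inputs; because T is 2D, it counts as a single page that is reused for every page of X, so the replicated copy the question worries about is never materialized. (In newer MATLAB releases, pagemtimes(X, T) offers a direct equivalent that also accepts gpuArray inputs.)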