GPU support in Multilabel Example?

Hi all,
I am currently working with the following MATLAB example:
https://de.mathworks.com/help/deeplearning/ug/multilabel-text-classification-using-deep-learning.html
My problem is that even though the canUseGPU function returns true (I am working with a GeForce RTX 2080 Ti on my local machine), the example does not seem to run on the GPU. However, the example states that it should use the GPU (lines 181-183 in the live script):
% If training on a GPU, then convert data to gpuArray.
if (executionEnvironment == "auto" && canUseGPU) || executionEnvironment == "gpu"
    dlX = gpuArray(dlX);
end
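For what it's worth, this is how I am checking whether the conversion actually took effect (a minimal sketch; isgpuarray was introduced in R2020b, so this assumes a sufficiently recent release):
% Sanity check: is the dlarray actually backed by a gpuArray?
% (Assumes dlX exists in the workspace after the conversion above.)
if isgpuarray(dlX)
    disp("dlX is on the GPU")
else
    disp("dlX is on the CPU")
end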
When I run the code and start training, the Task Manager shows no activity on the GPU.
Furthermore, I noticed that the following embedding function removes the gpuArray property of the data X (lines 387-397):
function Z = embedding(X, weights)
    % Reshape inputs into a vector.
    [N, T] = size(X, 2:3);
    X = reshape(X, N*T, 1);

    % Index into embedding matrix.
    Z = weights(:, X);

    % Reshape outputs by separating batch and sequence dimensions.
    Z = reshape(Z, [], N, T);
end
I assume this is what causes the computation to fall back to the CPU instead of running on the GPU.
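In case it is relevant: my guess for a workaround would be to move the learnable parameters to the GPU as well, so that indexing into weights produces a gpuArray again. This is only a sketch based on the dlupdate documentation (assuming the example's parameters struct), not something I have verified against this example:
% Possible workaround (unverified): put all learnable parameters on the
% GPU before the training loop, so weights(:, X) stays on the GPU.
if (executionEnvironment == "auto" && canUseGPU) || executionEnvironment == "gpu"
    parameters = dlupdate(@gpuArray, parameters);
end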
Can anybody help me solve this issue?
Thank you very much!

Answers (0)
