Deep Learning not using GPU

5 views (last 30 days)
Tanmay Rajpathak on 11 Jul 2019
Answered: sanidhyak on 3 Apr 2025
Why is MATLAB not using the GPU for deep learning, even though it says that it is?
(Attached screenshots: 2019-07-11 11_41_19-Window.png, 2019-07-11 11_41_48-Window.png)

Answers (1)

sanidhyak on 3 Apr 2025
Hi Tanmay,
I understand that you are training a deep learning model in MATLAB on a GPU, but MATLAB is not utilizing the GPU effectively even though it reports that it is training on a single GPU.
This can happen for several reasons, such as GPU compatibility, execution settings, or memory limitations.
Please consider the following steps to get the GPU properly utilized:
  • Run "gpuDevice" to confirm your GPU is CUDA-enabled and supported by MATLAB, and that the GPU driver and CUDA/cuDNN toolkit are correctly installed. If no device is detected, restart MATLAB and reinstall the driver/toolkit.
  • Explicitly request the GPU in your training options:
options = trainingOptions('sgdm', 'ExecutionEnvironment', 'gpu');
  • Increase the mini-batch size ("MiniBatchSize" in "trainingOptions") to improve GPU utilization.
  • Ensure Parallel Computing Toolbox is installed; it is required for GPU support.
  • Use "nvidia-smi" (or "watch -n 1 nvidia-smi" on Linux) to monitor GPU activity while training.
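The checks above can be sketched in one short MATLAB script; this is a minimal illustration, not a complete training script, and the mini-batch size of 256 is only an example value you should tune to your GPU memory:

```matlab
% Sketch of the checks above (requires Parallel Computing Toolbox).
% Confirm a supported CUDA GPU is visible to MATLAB; gpuDevice errors
% if no compatible GPU/driver is found.
gpu = gpuDevice;
fprintf('Using %s, %.1f GB memory available\n', ...
    gpu.Name, gpu.AvailableMemory/1e9);

% Force GPU execution so training fails fast instead of silently
% falling back to the CPU, and raise the mini-batch size to keep
% the GPU busy (256 is an example; adjust for your GPU memory).
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment', 'gpu', ...
    'MiniBatchSize', 256, ...
    'Plots', 'training-progress');

% net = trainNetwork(XTrain, YTrain, layers, options);  % your data/layers here
```

If the GPU is still idle under "nvidia-smi" after these settings, the bottleneck is usually data loading rather than compute.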
For further reference, kindly check MATLAB's official GPU support documentation.
Cheers & Happy Coding!

Categories

Find more on Deep Learning Toolbox in Help Center and File Exchange

Release

R2019a
